• Title/Summary/Keyword: Sigmoid

Search Results: 471

GENERALIZED SYMMETRICAL SIGMOID FUNCTION ACTIVATED NEURAL NETWORK MULTIVARIATE APPROXIMATION

  • ANASTASSIOU, GEORGE A.
    • Journal of Applied and Pure Mathematics / v.4 no.3_4 / pp.185-209 / 2022
  • Here we exhibit multivariate quantitative approximations of Banach space valued continuous multivariate functions on a box or ℝ^N, N ∈ ℕ, by the multivariate normalized, quasi-interpolation, Kantorovich type and quadrature type neural network operators. We also treat the case of approximation by iterated operators of the last four types. These approximations are achieved by establishing multidimensional Jackson type inequalities involving the multivariate modulus of continuity of the engaged function or its high order Fréchet derivatives. Our multivariate operators are defined by using a multidimensional density function induced by the generalized symmetrical sigmoid function. The approximations are pointwise and uniform. The related feed-forward neural network has one hidden layer.
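
The abstract does not state the activation formula. A minimal sketch follows, assuming the commonly used generalized symmetric sigmoid f_λ(x) = x/(1 + |x|^λ)^(1/λ), λ > 0, and a density-style bump built from shifted copies of it; both the formula and the normalization constant are assumptions for illustration, not quotations from the paper.

```python
import numpy as np

# Assumed form of a generalized symmetric sigmoid: odd, bounded in (-1, 1),
# and equal to x / (1 + |x|) when lam = 1.
def generalized_symmetric_sigmoid(x, lam=2.0):
    return x / (1.0 + np.abs(x) ** lam) ** (1.0 / lam)

# Assumed density-style bump: a normalized difference of shifted sigmoids,
# which is even, nonnegative, and decays as |x| grows.
def induced_density(x, lam=2.0):
    return 0.25 * (generalized_symmetric_sigmoid(x + 1.0, lam)
                   - generalized_symmetric_sigmoid(x - 1.0, lam))

if __name__ == "__main__":
    xs = np.linspace(-4.0, 4.0, 9)
    print(np.round(induced_density(xs), 4))  # peaked at 0, symmetric
```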

Pattern Recognition Analysis of Two Spirals and Optimization of Cascade Correlation Algorithm using CosExp and Sigmoid Activation Functions (이중나선의 패턴 인식 분석과 CosExp와 시그모이드 활성화 함수를 사용한 캐스케이드 코릴레이션 알고리즘의 최적화)

  • Lee, Sang-Wha
    • Journal of the Korea Academia-Industrial cooperation Society / v.15 no.3 / pp.1724-1733 / 2014
  • This paper presents a pattern recognition analysis of the two-spirals problem and an optimization of the Cascade Correlation learning algorithm using a non-monotone function, CosExp (cosine-modulated symmetric exponential function), in combination with the monotone sigmoid function. In addition, optimization of the algorithm with genetic algorithms is attempted. In the first experiment, the CosExp activation function is used for the candidate neurons of the learning algorithm, and the patterns recognized in the input space of the two-spirals problem are analyzed. In the second experiment, the CosExp function is used for the output neurons. In the third experiment, sigmoid activation functions with various parameters are used for the candidate neurons in 8 pools, and the CosExp function is used for the output neurons. In the fourth experiment, the candidate neurons are again organized in 8 pools, the values of the three parameters, including the displacement of the sigmoid function, are obtained using genetic algorithms, and these parameter values are applied to the sigmoid activation functions of the candidate neurons. To evaluate the performance of these algorithms, the classification of the training input patterns at each step is visualized as the shape of the two spirals. In the optimization process, the number of hidden neurons was reduced from 28 to 15, and the learning algorithm was finally optimized with 12 hidden neurons.
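
The abstract does not define CosExp or the three sigmoid parameters. The sketch below is a hedged illustration only: a plausible cosine-modulated symmetric exponential and a three-parameter sigmoid (gain, displacement, output scale) of the kind a genetic algorithm could tune for candidate neurons; the exact forms used in the paper may differ.

```python
import numpy as np

# Assumed non-monotone candidate activation: a cosine modulated by a
# symmetric exponential envelope.
def cos_exp(x, omega=2.0):
    return np.cos(omega * x) * np.exp(-x ** 2)

# Assumed three-parameter sigmoid: gain a, displacement b, output scale c.
def parameterized_sigmoid(x, a=1.0, b=0.0, c=1.0):
    return c / (1.0 + np.exp(-a * (x - b)))

if __name__ == "__main__":
    xs = np.linspace(-3.0, 3.0, 7)
    print(np.round(cos_exp(xs), 3))                           # oscillates, then decays
    print(np.round(parameterized_sigmoid(xs, 2.0, 0.5), 3))   # monotone S-curve
```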

Quadratic Sigmoid Neural Equalizer (이차 시그모이드 신경망 등화기)

  • Choi, Soo-Yong;Ong, Sung-Hwan;You, Cheol-Woo;Hong, Dae-Sik
    • Journal of the Korean Institute of Telematics and Electronics S / v.36S no.1 / pp.123-132 / 1999
  • In this paper, a quadratic sigmoid neural equalizer (QSNE) is proposed to improve the performance of conventional neural equalizers in terms of bit error probability by using a quadratic sigmoid function as the activation function of the neural networks. Conventional neural equalizers, which have been used to compensate for nonlinear distortions, adopt the sigmoid function. In a sigmoid neural equalizer, each neuron has one linear decision boundary, so many neurons are required when the equalizer has to separate a complicated structure. In the proposed QSNE and quadratic sigmoid neural decision feedback equalizer (QSNDFE), each neuron separates the decision region with two parallel lines. Therefore, QSNE and QSNDFE have better performance and simpler structure than conventional neural equalizers in terms of bit error probability. When the proposed QSNDFE is applied to communication systems and digital magnetic recording systems, it achieves an improvement of approximately 1.5 dB to 8.3 dB in signal-to-noise ratio (SNR) over the conventional decision feedback equalizer (DFE) and the neural decision feedback equalizer (NDFE). As intersymbol interference (ISI) and nonlinear distortions become more severe, QSNDFE shows an astounding SNR performance gain over the conventional equalizers at the same bit error probability.

  • PDF
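
The exact quadratic sigmoid used in the paper is not given in the abstract. As a hedged illustration of how a single neuron can realize two parallel decision lines, the sketch below applies the sigmoid to a quadratic function of the affine pre-activation; the specific form and the parameter alpha are assumptions.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Assumed quadratic-sigmoid neuron: y = sigmoid(alpha - (w.x + b)**2).
# Thresholding y at 0.5 gives the pair of parallel hyperplanes
# w.x + b = +sqrt(alpha) and w.x + b = -sqrt(alpha), i.e. two parallel
# lines in 2-D, consistent with the description in the abstract above.
def quadratic_sigmoid_neuron(x, w, b, alpha=1.0):
    pre = np.dot(w, x) + b
    return sigmoid(alpha - pre ** 2)

if __name__ == "__main__":
    w, b = np.array([1.0, -1.0]), 0.0
    for x in ([0.0, 0.0], [2.0, -2.0], [-2.0, 2.0]):
        print(x, round(float(quadratic_sigmoid_neuron(np.array(x), w, b)), 3))
```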

Multi-labeled Domain Detection Using CNN (CNN을 이용한 발화 주제 다중 분류)

  • Choi, Kyoungho;Kim, Kyungduk;Kim, Yonghe;Kang, Inho
    • 한국어정보학회:학술대회논문집 / 2017.10a / pp.56-59 / 2017
  • Using a CNN (Convolutional Neural Network), the multi-label utterance-topic classification task is carried out with a multi-labeling method and a cluster method, and each approach is evaluated with MSE (mean squared error), softmax cross-entropy, and sigmoid cross-entropy. The network tokenizes the input at the syllable level and takes as input a sequence in which part-of-speech information is added to each token, together with named-entity information obtained from the Naver DB. Experimental results show that the best performance, an F1 score of 0.9873, is obtained when the problem is reformulated with the cluster method and the network is trained with a sigmoid activation function in the output layer and a cross-entropy cost function.

  • PDF
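
The abstract above uses independent sigmoid outputs with a cross-entropy cost for multi-label prediction. A minimal sketch of that output setup follows; the label count and scores are illustrative assumptions, not the paper's data.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Numerically stable element-wise sigmoid cross-entropy:
# max(z, 0) - z * t + log(1 + exp(-|z|)), averaged over labels and examples.
def sigmoid_cross_entropy(logits, targets):
    return np.mean(np.maximum(logits, 0.0) - logits * targets
                   + np.log1p(np.exp(-np.abs(logits))))

if __name__ == "__main__":
    logits = np.array([[2.1, -1.3, 0.4]])   # scores for 3 hypothetical topics
    targets = np.array([[1.0, 0.0, 1.0]])   # an utterance can carry several topics
    print(sigmoid(logits) > 0.5)            # multi-label prediction by thresholding
    print(round(float(sigmoid_cross_entropy(logits, targets)), 4))
```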

Simplified neuron functions for FPGA evaluations of engineering neuron on gate array and analogue circuit

  • Saito, Masayuki;Wang, Qianyi;Aoyama, Tomoo
    • 제어로봇시스템학회:학술대회논문집 / 2001.10a / pp.157.6-157 / 2001
  • We estimated various neuron functions for the construction of engineering neurons, which are combinations of sigmoid, linear, sine, quadric, double/single-bended, and soft maximum/minimum functions. These combinations are evaluated by the properties of the potential surface between the learning points, the calculation speed, and the learning convergence, because the surface relates to the inference ability of a neuron system, while speed and convergence relate to the efficiency from the standpoint of engineering applications. After the evaluation, we can select a combination more appropriate than the original sigmoid function's, namely a single-bended function together with a linear one. The combination ...

  • PDF
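
The abstract does not define the simplified ("bended") functions. As a hedged illustration of a hardware-friendly substitute for the sigmoid, the sketch below uses a piecewise-linear ("hard") sigmoid, which needs only multiply, add, and compare operations; the slope and clamp range are assumptions.

```python
import numpy as np

# Assumed piecewise-linear approximation of the sigmoid: clamp a ramp to [0, 1].
def hard_sigmoid(x, slope=0.25):
    return np.clip(0.5 + slope * x, 0.0, 1.0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

if __name__ == "__main__":
    xs = np.linspace(-6.0, 6.0, 7)
    print(np.round(sigmoid(xs), 3))   # smooth reference curve
    print(hard_sigmoid(xs))           # close in the mid-range, saturates earlier
```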

Sigmoid Colon Perforation by a Distal Ventriculoperitoneal Shunt Catheter (뇌실복강간단락술 원위 도관에 의한 구불결장의 천공)

  • Shin, Dong-Keun;Kim, Seong-Ho
    • Journal of Yeungnam Medical Science / v.25 no.2 / pp.171-174 / 2008
  • We report an unusual case of a sigmoid colon perforation after ventriculoperitoneal shunt surgery. Distal catheters are known to cause perforation in the setting of colonoscopy. The exact pathogenesis of this complication is not clear, but it can cause serious complications. Hence, patients require prompt and aggressive management, including laparotomy with bowel wall repair, catheter removal, and antibiotic therapy.

  • PDF

Adaptive neural control for compensation of time varying characteristics (시스템의 시변성을 보상하기 위한 신경회로망을 이용한 적응제어)

  • 이영태;장준오;전기준
    • 제어로봇시스템학회:학술대회논문집 / 1992.10a / pp.224-229 / 1992
  • We investigate a neural network as a dynamic system controller when the system characteristics change abruptly. The shapes of the sigmoid functions are determined by an autotuning method to obtain the optimum sigmoid functions for the neural networks. Using information stored in the identification network, a novel algorithm that can adapt the control action of the controller has been developed. Robustness can be seen from its ability to adjust to large parameter variations. The potential of the proposed method is demonstrated by simulations.

  • PDF
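
The abstract does not describe the autotuning rule. As a hedged illustration only, the sketch below shows a sigmoid whose shape is set by a gain and a bias, together with the gradient of its output with respect to the gain, the kind of quantities a gradient-based autotuner could use; none of this is taken from the paper.

```python
import numpy as np

# Assumed "shaped" sigmoid: gain g controls the slope, bias b the displacement.
def shaped_sigmoid(x, g=1.0, b=0.0):
    return 1.0 / (1.0 + np.exp(-g * (x - b)))

# Gradient of the output with respect to the gain g (chain rule),
# usable by a gradient-based tuning step.
def d_shaped_sigmoid_dg(x, g=1.0, b=0.0):
    s = shaped_sigmoid(x, g, b)
    return s * (1.0 - s) * (x - b)

if __name__ == "__main__":
    xs = np.linspace(-2.0, 2.0, 5)
    print(np.round(shaped_sigmoid(xs, g=0.5), 3))  # shallow slope
    print(np.round(shaped_sigmoid(xs, g=3.0), 3))  # steep slope
    print(np.round(d_shaped_sigmoid_dg(xs), 3))
```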

CONSTRUCTION OF POSITIVE INTERPOLATION FUNCTIONS FOR DIFFUSION TENSOR

  • Shim, Hong-Tae
    • Journal of applied mathematics & informatics / v.23 no.1_2 / pp.563-570 / 2007
  • There has been considerable research interest in the medical community in neuronal fiber tracking with magnetic resonance diffusion tensor imaging (DTI). DTI data have abundant structural boundaries that need to be preserved during interpolation to facilitate fiber tracking. The sigmoid function has been used in recent papers, but from a mathematical point of view it is still not good enough to serve as a positive interpolation function. In this paper, we construct and provide two families of positive cardinal interpolation functions.
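
For context, a hedged statement of the standard notion involved (the paper's precise definitions are not given in the abstract): a cardinal interpolation function reproduces the samples on the integer grid, and positivity keeps the interpolant a nonnegative combination of the data.

```latex
% Assumed standard definitions, not quoted from the paper:
% L is a positive cardinal interpolation function if
\[
  L(k) = \delta_{0k} \quad (k \in \mathbb{Z}), \qquad L(x) \ge 0 ,
\]
% and the interpolant of samples f(k) is
\[
  \tilde f(x) = \sum_{k \in \mathbb{Z}} f(k)\, L(x - k).
\]
```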

Segmental Dilatation of the Sigmoid Colon : A Rare Cause of Chronic Constipation (S상 결장 분절확장증)

  • Park, Woo-Hyun;Choi, Soon-Ok;Paik, Tae-Won;Lee, Hee-Jung;Suh, Soo-Jhi;Kim, Sang-Pyo
    • Advances in pediatric surgery / v.1 no.1 / pp.68-72 / 1995
  • Segmental dilatation of the colon is a very rare disease entity of unknown etiology and may mimic Hirschsprung's disease. It is characterized by dilatation of a segment of the colon of variable length, with obstruction due to lack of peristalsis in a normally innervated intestine. Recently, the authors experienced a case of segmental dilatation of the sigmoid colon in a 6-month-old male, who had presented with severe constipation, abdominal distention, and an abdominal mass since 2 months of age. Down's syndrome and congenital nystagmus were associated. Barium enema demonstrated focal dilatation of the sigmoid colon, but the rectum and the descending colon proximal to the affected colon were of normal caliber. Rectal suction biopsy with acetylcholinesterase staining was normal, and anorectal manometry showed a normal rectosphincteric reflex. At operation, there was a massively dilated and hypertrophied sigmoid colon with increased tortuous serosal vessels, measuring 15 cm in length and 10 cm in width. Teniae coli were identifiable in the affected segment. Frozen section biopsies of the proximal, affected, and distal colon showed ganglion cells. A descending loop colostomy was constructed initially, and segmental resection with end-to-end colocolostomy was carried out 3 months later. Final histologic examination showed 1) normal colonic mucosa with ganglion cells, 2) prominent submucosal fibrosis and marked muscular hypertrophy, and 3) unremarkable acetylcholinesterase activity and immunohistochemical findings against S-100 protein. At 8 months of follow-up, he has been doing well and moves his bowels 1-2 times daily.

  • PDF

Supervised Learning Artificial Neural Network Parameter Optimization and Activation Function Basic Training Method using Spreadsheets (스프레드시트를 활용한 지도학습 인공신경망 매개변수 최적화와 활성화함수 기초교육방법)

  • Hur, Kyeong
    • Journal of Practical Engineering Education / v.13 no.2 / pp.233-242 / 2021
  • In this paper, as a liberal arts course for non-majors, we propose a supervised-learning artificial neural network parameter optimization method and a basic education method for activation functions, in order to design a basic artificial neural network curriculum. For this, a method of finding a parameter optimization solution in a spreadsheet, without programming, was applied. Through this training method, students can focus on the basic principles of artificial neural network operation and implementation, and the visualized spreadsheet data can increase the interest and educational effect for non-majors. The proposed contents consist of artificial neurons with sigmoid and ReLU activation functions, supervised learning data generation, supervised-learning artificial neural network configuration and parameter optimization, implementation and performance analysis of the supervised-learning artificial neural network using spreadsheets, and an education satisfaction analysis. Considering the optimization of negative parameters for the sigmoid-neuron and the ReLU-neuron artificial neural networks, we propose a training method based on the four performance analysis results for parameter optimization of the artificial neural network, and conduct a training satisfaction analysis.
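
As a hedged companion to the exercise described above, the sketch below fits a single sigmoid neuron by plain gradient descent on mean squared error, the same arithmetic a spreadsheet version would carry out cell by cell; the toy data, learning rate, and epoch count are illustrative assumptions.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Fit the two parameters (weight w, bias b) of one sigmoid neuron by
# gradient descent on mean squared error over the toy supervised data.
def train_neuron(x, t, lr=0.5, epochs=2000):
    w, b = 0.1, 0.0
    for _ in range(epochs):
        y = sigmoid(w * x + b)
        grad_z = (y - t) * y * (1.0 - y)   # chain rule through the sigmoid
        w -= lr * np.mean(grad_z * x)
        b -= lr * np.mean(grad_z)
    return w, b

if __name__ == "__main__":
    x = np.array([0.0, 1.0, 2.0, 3.0])     # assumed inputs
    t = np.array([0.0, 0.0, 1.0, 1.0])     # assumed targets
    w, b = train_neuron(x, t)
    print(round(w, 3), round(b, 3), np.round(sigmoid(w * x + b), 2))
```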