• Title/Summary/Keyword: Dynamic Neurons


Genetically Optimized Self-Organizing Fuzzy Polynomial Neural Networks Based on Fuzzy Polynomial Neurons (퍼지다항식 뉴론 기반의 유전론적 최적 자기구성 퍼지 다항식 뉴럴네트워크)

  • 박호성;이동윤;오성권
    • The Transactions of the Korean Institute of Electrical Engineers D
    • /
    • v.53 no.8
    • /
    • pp.551-560
    • /
    • 2004
  • In this paper, we propose a new architecture of Self-Organizing Fuzzy Polynomial Neural Networks (SOFPNN) that is based on a genetically optimized multilayer perceptron with fuzzy polynomial neurons (FPNs), and we discuss a comprehensive design methodology involving mechanisms of genetic optimization, especially genetic algorithms (GAs). The proposed SOFPNN gives rise to a structurally optimized network and comes with a substantial level of flexibility in comparison to conventional SOFPNNs. The design procedure applied in the construction of each layer of a SOFPNN deals with its structural optimization, involving the selection of preferred nodes (or FPNs) with specific local characteristics (such as the number of input variables, the order of the polynomial of the consequent part of the fuzzy rules, and the specific subset of input variables), and also addresses specific aspects of parametric optimization. Through this consecutive process of structural and parametric optimization, an optimized and flexible fuzzy neural network is generated in a dynamic fashion. To evaluate the performance of the genetically optimized SOFPNN, the model is tested on two time series data sets (gas furnace and chaotic time series). A comparative analysis reveals that the proposed SOFPNN exhibits higher accuracy and superb predictive capability in comparison to some previous models available in the literature.
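
The structural search described above lends itself to a simple genetic encoding. The sketch below is a minimal illustration of that idea, not the authors' implementation: each chromosome selects a candidate FPN's input subset and the polynomial order of its rule consequent, and a generic GA keeps the best-scoring nodes of a layer. The function and parameter names, and the placeholder `evaluate_node` fitness callback, are assumptions.

```python
import random

# Hypothetical illustration of GA-driven structural selection for one SOFPNN layer.
# Chromosome = (chosen input subset, polynomial order of the rule consequent).
POLY_ORDERS = ["linear", "quadratic", "modified_quadratic"]

def random_chromosome(n_candidates, max_inputs=3):
    k = random.randint(1, max_inputs)
    return {
        "inputs": tuple(sorted(random.sample(range(n_candidates), k))),
        "order": random.choice(POLY_ORDERS),
    }

def fitness(chrom, evaluate_node):
    # evaluate_node is assumed to train the FPN defined by `chrom`
    # and return its error on the data set (smaller is better).
    return -evaluate_node(chrom["inputs"], chrom["order"])

def evolve_layer(n_candidates, evaluate_node, pop_size=20, generations=30, keep=5):
    pop = [random_chromosome(n_candidates) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda c: fitness(c, evaluate_node), reverse=True)
        parents = pop[:keep]
        children = []
        while len(children) < pop_size - keep:
            a, b = random.sample(parents, 2)
            child = {"inputs": random.choice([a, b])["inputs"],
                     "order": random.choice([a, b])["order"]}
            if random.random() < 0.1:          # occasional mutation
                child = random_chromosome(n_candidates)
            children.append(child)
        pop = parents + children
    # the preferred nodes kept here would become the inputs of the next layer
    return sorted(pop, key=lambda c: fitness(c, evaluate_node), reverse=True)[:keep]
```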

Involvement of NMDA Receptor and L-type Calcium Channel in the Excitatory Action of Morphine

  • Koo, Bon-Seop;Shin, Hong-Kee;Kang, Suk-Han;Jun, Jong-Hun
    • The Korean Journal of Physiology and Pharmacology
    • /
    • v.6 no.5
    • /
    • pp.241-246
    • /
    • 2002
  • We studied the excitatory action of morphine on the responses of dorsal horn neurons to iontophoretic application of excitatory amino acids and to C-fiber stimulation, using the in vivo electrophysiological technique in the rat. In 137 of the 232 wide dynamic range (WDR) neurons tested, iontophoretic application of morphine enhanced the WDR neuron responses to N-methyl-D-aspartate (NMDA), kainate, and graded electrical stimulation of C-fibers. Morphine did not have any excitatory effect on the responses of low-threshold cells. The morphine-induced excitatory effect at low ejection current was naloxone-reversible and reversed to an inhibitory action at high ejection current. NMDA receptor, calcium channel, and intracellular $Ca^{2+}$ antagonists strongly antagonized the morphine-induced excitatory effect. These results suggest that changes in intracellular ionic concentration, especially of $Ca^{2+}$, play an important role in the induction of the excitatory effect of morphine in rat dorsal horn neurons.

Learning an Artificial Neural Network Using Dynamic Particle Swarm Optimization-Backpropagation: Empirical Evaluation and Comparison

  • Devi, Swagatika;Jagadev, Alok Kumar;Patnaik, Srikanta
    • Journal of information and communication convergence engineering
    • /
    • v.13 no.2
    • /
    • pp.123-131
    • /
    • 2015
  • Training neural networks is a complex task of great importance in the field of supervised learning. In the training process, a set of input-output patterns is repeatedly presented to an artificial neural network (ANN), and from those patterns the weights of all the interconnections between neurons are adjusted until the specified input yields the desired output. In this paper, a new hybrid algorithm is proposed for global optimization of connection weights in an ANN. Dynamic swarms are shown to converge rapidly during the initial stages of a global search, but around the global optimum the search process becomes very slow. In contrast, the gradient descent method achieves faster convergence around the global optimum, and at the same time its convergence accuracy can be relatively high. Therefore, the proposed hybrid algorithm combines the dynamic particle swarm optimization (DPSO) algorithm with the backpropagation (BP) algorithm, and is referred to as the DPSO-BP algorithm, to train the weights of an ANN. In this paper, we intend to show the superiority (time performance and quality of solution) of the proposed hybrid algorithm (DPSO-BP) over other more standard algorithms in neural network training. The algorithms are compared on two different datasets, and the simulation results are reported.
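
As a rough illustration of the two-stage idea, the sketch below runs a plain particle swarm over the weight vector first and then refines the swarm's best solution with gradient descent. It assumes the network exposes `loss(w)` and `grad(w)` callables; it is a generic PSO-plus-BP sketch, not the authors' DPSO-BP algorithm.

```python
import numpy as np

# Illustrative PSO-then-gradient-descent hybrid for a weight vector of size `dim`.
# `loss(w)` and `grad(w)` are assumed to be supplied by the ANN being trained.
def pso_bp_train(loss, grad, dim, n_particles=30, pso_iters=200, bp_iters=500,
                 inertia=0.7, c1=1.5, c2=1.5, lr=0.01, seed=0):
    rng = np.random.default_rng(seed)
    pos = rng.uniform(-1, 1, (n_particles, dim))
    vel = np.zeros((n_particles, dim))
    pbest = pos.copy()
    pbest_val = np.array([loss(p) for p in pos])
    gbest = pbest[pbest_val.argmin()].copy()

    # Stage 1: the swarm explores globally (fast far from the optimum).
    for _ in range(pso_iters):
        r1, r2 = rng.random((n_particles, dim)), rng.random((n_particles, dim))
        vel = inertia * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = pos + vel
        vals = np.array([loss(p) for p in pos])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
        gbest = pbest[pbest_val.argmin()].copy()

    # Stage 2: gradient descent refines the swarm's best solution locally.
    w = gbest
    for _ in range(bp_iters):
        w = w - lr * grad(w)
    return w
```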

Analysis of Dynamical State Transition of Cyclic Connection Neural Networks with Binary Synaptic Weights (이진화된 결합하중을 갖는 순환결합형 신경회로망의 동적 상태천이 해석)

  • 박철영
    • Journal of the Korean Institute of Telematics and Electronics C
    • /
    • v.36C no.5
    • /
    • pp.76-85
    • /
    • 1999
  • An intuitive understanding of dynamic pattern generation in asymmetric networks may be useful for developing models of dynamic information processing. In this paper, the dynamic behavior of the cyclic connection neural network, in which each neuron is connected only to its nearest neurons with binary synaptic weights of $\pm1$, has been investigated. Simulation results show that the dynamic behavior of the network can be classified into only three categories: fixed points, limit cycles with a basin, and limit cycles with no basin. Furthermore, the number and the type of limit cycles generated by the networks have been derived through an analytical method. Sufficient conditions for a state vector of an $n$-neuron network to produce a limit cycle of period $n$ or $2n$ are also given. The results show that the estimated number of limit cycles is an exponential function of $n$. On the basis of this study, the cyclic connection neural network may be capable of storing a large amount of dynamic information.

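A toy reconstruction of the kind of simulation the abstract describes is sketched below: it enumerates every state of a small ring of ±1-weighted threshold neurons, follows each trajectory until it repeats, and tallies the periods of the resulting fixed points and limit cycles. The synchronous sign-threshold update rule and all names are assumptions for illustration.

```python
import itertools

# Illustrative enumeration of the state transitions of a small ring network
# whose neurons take values +1/-1 and are driven only by their two nearest
# neighbours through binary weights in {+1, -1}.
def step(state, w_left, w_right):
    n = len(state)
    nxt = []
    for i in range(n):
        h = w_left[i] * state[(i - 1) % n] + w_right[i] * state[(i + 1) % n]
        nxt.append(1 if h >= 0 else -1)   # sign threshold, ties broken as +1
    return tuple(nxt)

def classify(n, w_left, w_right):
    """Follow every initial state until a state repeats; tally cycle periods."""
    periods = {}
    for state in itertools.product((-1, 1), repeat=n):
        seen = {}
        s = state
        while s not in seen:
            seen[s] = len(seen)
            s = step(s, w_left, w_right)
        period = len(seen) - seen[s]      # 1 => fixed point, >1 => limit cycle
        periods[period] = periods.get(period, 0) + 1
    return periods  # e.g. {1: #states flowing to fixed points, n: ..., 2n: ...}

# Example: 6-neuron ring with all weights +1.
print(classify(6, [1] * 6, [1] * 6))
```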

Dynamical Properties of Ring Connection Neural Networks and Its Application (환상결합 신경회로망의 동적 성질과 응용)

  • 박철영
    • Journal of Korea Society of Industrial Information Systems
    • /
    • v.4 no.1
    • /
    • pp.68-76
    • /
    • 1999
  • An intuitive understanding of dynamic pattern generation in asymmetric networks may be useful for developing models of dynamic information processing. In this paper, the dynamic behavior of the ring connection neural network, in which each neuron is connected only to its nearest neurons with binary synaptic weights of ±1, has been investigated. Simulation results show that the dynamic behavior of the network can be classified into only three categories: fixed points, limit cycles with a basin, and limit cycles with no basin. Furthermore, the number and the type of limit cycles generated by the networks have been derived through an analytical method. Sufficient conditions for a state vector of an n-neuron network to produce a limit cycle of period n or 2n are also given. The results show that the estimated number of limit cycles is an exponential function of n. On the basis of this study, the ring connection neural network may be capable of storing a large amount of dynamic information.


Study on the Shortest Path by the energy function in Hopfield networks (홉필드 네트웍에서 에너지 함수를 이용한 최적 경로 탐색에 관한 연구)

  • Ko, Young-Hoon;Kim, Yoon-Sang
    • The Journal of the Institute of Internet, Broadcasting and Communication
    • /
    • v.10 no.5
    • /
    • pp.215-221
    • /
    • 2010
  • Hopfield networks have been proposed as a new computational tool for finding the shortest path in networks. Zhang and Ali studied a method of finding the shortest path with expanded neurons of Hopfield networks; the Ali algorithm is well known as a tool whose number of neurons equals the number of branches. As a network grows bigger, the Ali algorithm needs much more time to solve the problem. This paper modifies the method used to find the synapse matrix and the input bias vector, and it applies the eSPN algorithm after a proper number of iterations of the Hopfield network. The proposed method is a two-stage method and is more efficient at finding the shortest path. It is verified on three sample networks, and it could be more applicable than the Ali algorithm because it is fast and easy. When the cost of a branch is changed, the proposed method still works properly; therefore, networks with dynamically varying costs can also be handled by the proposed method.
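
To make the energy-function idea concrete, the sketch below writes a simplified shortest-path energy over arc-indicator neurons (total arc cost plus quadratic flow-conservation penalties) and minimizes it by single-neuron greedy descent. The energy terms, coefficients, and update schedule are illustrative assumptions, not the formulation or the eSPN stage used in the paper.

```python
import numpy as np

# Toy sketch of a Hopfield-style energy descent for the shortest path problem.
# Neuron v[i, j] == 1 means "arc (i, j) is on the path". The energy mixes total
# arc cost with soft constraints that the path leaves the source once, enters
# the destination once, and conserves flow elsewhere.
def shortest_path_energy(v, cost, src, dst, mu1=1.0, mu2=5.0):
    n = cost.shape[0]
    path_cost = np.sum(cost * v)
    flow_err = 0.0
    for i in range(n):
        out_flow = v[i, :].sum() - v[:, i].sum()
        target = 1.0 if i == src else (-1.0 if i == dst else 0.0)
        flow_err += (out_flow - target) ** 2
    return mu1 * path_cost + mu2 * flow_err

def greedy_descent(cost, src, dst, iters=2000, seed=0):
    """Flip one neuron at a time whenever the flip lowers the energy."""
    rng = np.random.default_rng(seed)
    n = cost.shape[0]
    v = np.zeros((n, n))
    for _ in range(iters):
        i, j = rng.integers(n), rng.integers(n)
        if i == j:
            continue
        trial = v.copy()
        trial[i, j] = 1.0 - trial[i, j]
        if shortest_path_energy(trial, cost, src, dst) < \
           shortest_path_energy(v, cost, src, dst):
            v = trial
    return v  # arcs with v[i, j] == 1 approximate the shortest path
```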

Advanced Self-Organizing Neural Networks Based on Competitive Fuzzy Polynomial Neurons (경쟁적 퍼지다항식 뉴런에 기초한 고급 자기구성 뉴럴네트워크)

  • 박호성;박건준;이동윤;오성권
    • The Transactions of the Korean Institute of Electrical Engineers D
    • /
    • v.53 no.3
    • /
    • pp.135-144
    • /
    • 2004
  • In this paper, we propose a competitive fuzzy polynomial neuron-based advanced Self-Organizing Neural Network (SONN) architecture for optimal model identification and discuss a comprehensive design methodology supporting its development. The proposed SONN dwells on the ideas of fuzzy rule-based computing and neural networks, and it consists of layers with activation nodes based on fuzzy inference rules and regression polynomials. Each activation node is presented as a Fuzzy Polynomial Neuron (FPN), which includes either simplified or regression polynomial fuzzy inference rules. As the form of the conclusion part of the rules, the regression polynomial in particular uses several types of high-order polynomials, such as linear, quadratic, and modified quadratic. As the premise part of the rules, both triangular and Gaussian-like membership functions are studied, and the number of premise input variables used in the rules depends on the number of inputs of its node in each layer. We introduce two kinds of SONN architectures, the basic and the modified one, each in both a generic and an advanced type. Here the basic and modified architectures differ in the number of input variables and the order of the polynomial in each layer. The number of layers and the number of nodes in each layer of the SONN are not predetermined, unlike in the popular multilayer perceptron structure, but are generated in a dynamic way. The superiority and effectiveness of the proposed SONN architecture are demonstrated through two representative numerical examples.
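
For concreteness, the sketch below shows what a single fuzzy polynomial neuron of this general kind could compute: Gaussian premise memberships over the node's inputs and a rule consequent chosen from linear, quadratic, or modified quadratic polynomials, combined by membership-weighted averaging. Parameter estimation and the competitive selection between nodes are omitted, and all names are assumptions.

```python
import numpy as np

# Illustrative single fuzzy polynomial neuron (FPN): Gaussian premise
# memberships and a polynomial rule consequent, combined by the usual
# membership-weighted average. Coefficients would normally be estimated
# from data; here they are simply taken as given.
def consequent(x, coeffs, order="quadratic"):
    x = np.asarray(x, dtype=float)
    if order == "linear":            # a0 + sum a_i x_i
        terms = np.concatenate(([1.0], x))
    elif order == "quadratic":       # adds squared and cross terms
        cross = [x[i] * x[j] for i in range(len(x)) for j in range(i, len(x))]
        terms = np.concatenate(([1.0], x, cross))
    else:                            # "modified_quadratic": squares only, no cross terms
        terms = np.concatenate(([1.0], x, x ** 2))
    return float(np.dot(np.asarray(coeffs, dtype=float)[: len(terms)], terms))

def fpn_output(x, rules):
    """rules: list of dicts with 'centers', 'sigmas', 'coeffs', 'order'."""
    x = np.asarray(x, dtype=float)
    num, den = 0.0, 0.0
    for r in rules:
        centers = np.asarray(r["centers"], dtype=float)
        sigmas = np.asarray(r["sigmas"], dtype=float)
        # premise: product of Gaussian memberships over the node's inputs
        mu = np.prod(np.exp(-((x - centers) ** 2) / (2 * sigmas ** 2)))
        num += mu * consequent(x, r["coeffs"], r["order"])
        den += mu
    return num / den if den > 0 else 0.0
```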

An Enhanced Counterpropagation Algorithm for Effective Pattern Recognition (효과적인 패턴 인식을 위한 개선된 Counterpropagation 알고리즘)

  • Kim, Kwang-Baek
    • Journal of the Korea Institute of Information and Communication Engineering
    • /
    • v.12 no.9
    • /
    • pp.1682-1688
    • /
    • 2008
  • The Counterpropagation algorithm (CP) is a combination of a Kohonen competitive network as a hidden layer and the outstar structure of Grossberg as an output layer. CP has been used in many real applications for pattern matching, classification, data compression, and statistical analysis, since its learning speed is faster than that of other network models. However, due to the Kohonen layer's winner-takes-all strategy, it often causes unstable learning and/or incorrect pattern classification when patterns are relatively diverse, and it is often criticized for the sensitivity of its performance to the learning rate. In this paper, we propose an enhanced CP that has multiple Kohonen layers, a facility for dynamically controlling the learning rate using the frequency of winner neurons and the difference between the input vector and the representative of the winner neuron for stable learning, and momentum learning for controlling the weights of the output links. A real-world application experiment, pattern recognition from passport information, is designed for the performance evaluation of this enhanced CP, and it shows that our proposed algorithm improves on the conventional CP in learning and recognition performance.
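
The dynamic learning-rate idea can be illustrated with a small sketch: the winner's update rate is damped by how often that neuron has won and scaled by the distance between the input and the winner's weight vector. The control law below is an assumption chosen only to show the mechanism, not the paper's formula.

```python
import numpy as np

# Illustrative Kohonen-layer update with a dynamically controlled learning
# rate: the rate shrinks for neurons that win often and grows with the
# mismatch between the input and the winner's weight vector.
def kohonen_step(weights, win_count, x, base_lr=0.5):
    x = np.asarray(x, dtype=float)
    dists = np.linalg.norm(weights - x, axis=1)
    winner = int(dists.argmin())                    # winner-takes-all
    win_count[winner] += 1

    # dynamic learning rate: damped by win frequency, scaled by mismatch
    lr = base_lr * (1.0 / (1.0 + win_count[winner])) * \
         (dists[winner] / (1.0 + dists[winner]))
    weights[winner] += lr * (x - weights[winner])   # move winner toward input
    return winner, lr

# Usage sketch: 4 Kohonen neurons over 3-dimensional inputs.
rng = np.random.default_rng(0)
W = rng.random((4, 3))
counts = np.zeros(4)
for sample in rng.random((100, 3)):
    kohonen_step(W, counts, sample)
```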

Neuroglial Cells : An Overview of Their Physiological Roles and Abnormalities in Mental Disorders (신경아교세포의 정상 기능과 정신장애에서 나타나는 신경아교세포 이상에 대한 고찰)

  • Lee, Kyungmin
    • Korean Journal of Biological Psychiatry
    • /
    • v.22 no.2
    • /
    • pp.29-33
    • /
    • 2015
  • The brain maintains homeostasis and a normal microenvironment through dynamic interactions between neurons and neuroglial cells, which enable proper information processing and normal cognitive function. Recent post-mortem investigations and animal model studies have demonstrated that various brain areas, such as the cerebral cortex, hippocampus, and amygdala, show abnormalities in neuroglial numbers and functions in subjects with mental illnesses including schizophrenia, dementia, and mood disorders such as major depression and bipolar disorder. These findings highlight the putative role and involvement of neuroglial cells in mental disorders. Herein I discuss the physiological roles of neuroglial cells such as astrocytes, oligodendrocytes, and microglia in maintaining normal brain function, and their abnormalities in relation to mental disorders. Finally, these findings could serve as a useful starting point for therapeutic concepts and drug development aimed at the abnormal behaviors and cognitive dysfunctions observed in mental disorders.

Obstacle Avoidance Using Modified Hopfield Neural Network for Multiple Robots

  • Ritthipravat, Panrasee;Maneewarn, Thavida;Laowattana, Djitt;Nakayama, Kenji
    • Proceedings of the IEEK Conference
    • /
    • 2002.07b
    • /
    • pp.790-793
    • /
    • 2002
  • In this paper, dynamic path planning for two mobile robots using a modified Hopfield neural network is studied. In each step, an area that excludes obstacles and allows the activation levels of neurons to change gradually is derived. The next moving step can then be determined by searching for the most highly activated neuron. By learning repeatedly, the steps are generated from the starting points to the goal points, and a path is constructed from these steps. Simulations show the constructed paths of two mobile robots moving across each other to their goals.

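A generic illustration of activation-map path planning in this spirit is sketched below: the goal cell is clamped to a high activation, obstacle cells to zero, free cells relax toward the average of their neighbours, and the robot repeatedly steps to the most activated neighbouring cell. This is not the modified Hopfield network of the paper; the grid size, update rule, and names are assumptions.

```python
import numpy as np

# Toy sketch of path planning on an activation grid: the goal is kept at a
# high activation, obstacles at zero, and each free cell relaxes toward the
# average of its neighbours (a Hopfield-like iterative update). The robot
# then repeatedly steps to the most activated neighbouring cell.
def plan_path(grid, start, goal, relax_iters=500):
    act = np.zeros_like(grid, dtype=float)
    for _ in range(relax_iters):
        act[goal] = 1.0                     # clamp the goal activation
        padded = np.pad(act, 1)
        neigh = (padded[:-2, 1:-1] + padded[2:, 1:-1] +
                 padded[1:-1, :-2] + padded[1:-1, 2:]) / 4.0
        act = np.where(grid == 1, 0.0, neigh)   # obstacles stay at zero
    act[goal] = 1.0

    path, pos = [start], start
    while pos != goal and len(path) < act.size:
        r, c = pos
        candidates = [(r + dr, c + dc) for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1))
                      if 0 <= r + dr < act.shape[0] and 0 <= c + dc < act.shape[1]]
        pos = max(candidates, key=lambda p: act[p])  # climb the activation field
        path.append(pos)
    return path

# 5x5 map with an obstacle wall (1 = obstacle), start upper-left, goal lower-right.
grid = np.zeros((5, 5), dtype=int)
grid[1:4, 2] = 1
print(plan_path(grid, (0, 0), (4, 4)))
```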