• Title/Summary/Keyword: Jordan Recurrent Neural Network


A New Type of Recurrent Neural Network for the Improvement of Pattern Recognition Ability (패턴 인식 성능을 향상시키는 새로운 형태의 순환신경망)

  • Jeong, Nak-U;Kim, Byeong-Gi
    • The Transactions of the Korea Information Processing Society
    • /
    • v.4 no.2
    • /
    • pp.401-408
    • /
    • 1997
  • Humans acquire almost all of their knowledge by recognizing and accumulating input patterns, such as images and sounds, received through the eyes and ears. Among these abilities, character recognition, which allows a person to recognize characters and understand their meanings through visual information, is now realized as a computer pattern recognition system using neural networks. A recurrent neural network is a model that reuses its output values during neural network learning. Recently, many studies have tried to apply recurrent neural networks to the classification of static patterns such as off-line handwritten characters, but most of these efforts have not been very effective so far. This study suggests a new type of recurrent neural network for the effective classification of static patterns such as off-line handwritten characters. Using the new J-E (Jordan-Elman) neural network model, which enlarges and combines the Jordan model and the Elman model, the proposed network recognizes static patterns such as digits and handwritten characters better than previous models.

  • PDF
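
The abstract above describes a J-E model that combines the Jordan context (output layer fed back) with the Elman context (hidden layer fed back). The paper gives no equations here, so the following is only a minimal sketch of one plausible forward pass for such a combined cell; the function name, weight layout, and activation choices are illustrative assumptions, not the authors' specification.

```python
import numpy as np

def je_forward(x_seq, W_in, W_ctx_h, W_ctx_o, W_hid_out, b_h, b_o):
    """One possible reading of a combined Jordan-Elman ('J-E') cell:
    the hidden layer receives the current input, the previous hidden
    state (Elman context), and the previous output (Jordan context)."""
    h_prev = np.zeros(b_h.shape[0])  # Elman context: copy of hidden layer
    o_prev = np.zeros(b_o.shape[0])  # Jordan context: copy of output layer
    outputs = []
    for x in x_seq:
        # hidden pre-activation combines input and both context signals
        h = np.tanh(W_in @ x + W_ctx_h @ h_prev + W_ctx_o @ o_prev + b_h)
        o = np.tanh(W_hid_out @ h + b_o)
        h_prev, o_prev = h, o
        outputs.append(o)
    return np.array(outputs)
```

For static patterns such as handwritten characters, the same image (or a feature sequence derived from it) would be presented over several time steps so the contexts can settle.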

A Study on Speech Recognition using Recurrent Neural Networks (회귀신경망을 이용한 음성인식에 관한 연구)

  • 한학용;김주성;허강인
    • The Journal of the Acoustical Society of Korea
    • /
    • v.18 no.3
    • /
    • pp.62-67
    • /
    • 1999
  • In this paper, we investigate a reliable model of the predictive recurrent neural network for speech recognition. Predictive neural networks are modeled in syllable units; for a given input syllable, the model that gives the minimum prediction error is taken as the recognition result. The predictive neural network was given a recurrent structure so that the dynamic features of the speech pattern are captured by the network. We compared the recognition ability of the recurrent networks proposed by Elman and Jordan. ETRI's SAMDORI was used as the speech database. In order to find a reliable neural network model, the changes in the two recognition rates were compared under the following conditions: (1) changing the prediction order and the number of hidden units; and (2) accumulating previous values with a self-loop coefficient in the context layer. The results show that the optimal prediction order, number of hidden units, and self-loop coefficient responded differently according to the structure of the neural network used. In general, however, Jordan's recurrent network shows a relatively higher recognition rate than Elman's. The effect of the self-loop coefficient on the recognition rate varied according to the network structure and the coefficient's value.

  • PDF
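
The self-loop coefficient mentioned above lets the Jordan context accumulate past outputs rather than merely copying the last one. Below is a minimal sketch of a Jordan-type predictive network, assuming the common context update c_t = α·c_{t-1} + y_{t-1}; the paper's exact equations are not reproduced here, and all names and dimensions are illustrative.

```python
import numpy as np

def jordan_predict(x_seq, W_in, W_ctx, W_out, b_h, b_o, alpha=0.5):
    """Sketch of a Jordan-type predictive network: the context layer
    accumulates previous outputs through a self-loop coefficient alpha."""
    c = np.zeros(b_o.shape[0])  # Jordan context (accumulated outputs)
    preds = []
    for x in x_seq:
        h = np.tanh(W_in @ x + W_ctx @ c + b_h)
        y = W_out @ h + b_o      # linear output predicts the next frame
        c = alpha * c + y        # self-loop accumulation in the context
        preds.append(y)
    return np.array(preds)
```

With alpha = 0 the context reduces to a plain copy of the previous output; larger values give the context a longer memory, which is the trade-off the experiments above explore.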

The Improving Method of Characters Recognition Using New Recurrent Neural Network (새로운 순환신경망을 사용한 문자인식성능의 향상 방안)

  • 정낙우;김병기
    • Journal of the Korea Society of Computer and Information
    • /
    • v.1 no.1
    • /
    • pp.129-138
    • /
    • 1996
  • As a result of industrial development and the growth and sophistication of technology, a large amount of information is processed every year. To achieve informatization, we must store in computers the information that has long been written on paper, and be able to use it at the right time and place. The recurrent neural network is a model that reuses output values when training a neural network for character recognition, but most such methods have not been applied very effectively. This study suggests a new type of recurrent neural network for effectively classifying static patterns such as off-line handwritten characters. Using the new J-E (Jordan-Elman) neural network model, which enlarges and combines the Jordan and Elman models, this study shows that the proposed network recognizes patterns such as digits and handwritten characters better than previous models.

  • PDF

A New Recurrent Neural Network Architecture for Pattern Recognition and Its Convergence Results

  • Lee, Seong-Whan;Kim, Young-Joon;Song, Hee-Heon
    • Journal of Electrical Engineering and information Science
    • /
    • v.1 no.1
    • /
    • pp.108-117
    • /
    • 1996
  • In this paper, we propose a new type of recurrent neural network architecture in which each output unit is connected with itself and fully connected with the other output units and all hidden units. The proposed recurrent network differs from Jordan's and Elman's recurrent networks in both function and architecture, because it was originally extended from the multilayer feedforward neural network to improve discrimination and generalization power. We also prove the convergence property of the learning algorithm of the proposed recurrent neural network and analyze its performance through recognition experiments with the totally unconstrained handwritten numeral database of Concordia University, Canada. Experimental results confirm that the proposed recurrent neural network improves the discrimination and generalization power in recognizing spatial patterns.

  • PDF
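
My reading of the architecture described above is a feedforward input-to-hidden stage plus an output layer whose units feed back to themselves and to every other output unit at the next step. The sketch below is only that reading, not the authors' published equations; activation functions and weight shapes are assumptions.

```python
import numpy as np

def recurrent_output_layer(x_seq, W_ih, W_ho, W_oo, b_h, b_o):
    """Sketch: feedforward hidden stage, plus an output layer in which
    every output unit is connected with itself and fully connected with
    the other output units (via the recurrent matrix W_oo)."""
    y = np.zeros(b_o.shape[0])  # previous output vector, initially zero
    ys = []
    for x in x_seq:
        h = np.tanh(W_ih @ x + b_h)  # purely feedforward hidden layer
        # outputs depend on hidden units and on all previous outputs
        y = 1.0 / (1.0 + np.exp(-(W_ho @ h + W_oo @ y + b_o)))
        ys.append(y)
    return np.array(ys)
```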

A Study on the Recognition of Korean Numerals Using Recurrent Neural Predictive HMM (회귀신경망 예측 HMM을 이용한 숫자음 인식에 관한 연구)

  • 김수훈;고시영;허강인
    • The Journal of the Acoustical Society of Korea
    • /
    • v.20 no.8
    • /
    • pp.12-18
    • /
    • 2001
  • In this paper, we propose the Recurrent Neural Predictive HMM (RNPHMM), a hybrid of the recurrent neural network and the HMM. A predictive recurrent neural network is trained to predict the future vector from several preceding feature vectors, and defines each state of the HMM. Instead of stable average vectors, this method uses the prediction values from the predictive recurrent neural network, which change dynamically under the influence of the previous feature vectors. The RNPHMM models are the Elman network prediction HMM and the Jordan network prediction HMM. In the experiments, we compared the recognition abilities of the RNPHMM on isolated digits as we increased the number of states, the prediction order, and the number of hidden nodes. As a result, both the Elman network prediction HMM and the Jordan network prediction HMM achieved a good recognition rate of 98.5% on the test data.

  • PDF
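
The minimum-prediction-error decision rule shared by the predictive models above can be sketched with simple linear predictors standing in for the recurrent networks. The model names, the linear form, and the per-class dictionary are illustrative assumptions; only the decision rule itself (score each class model by its accumulated prediction error, pick the smallest) follows the abstract.

```python
import numpy as np

def predictor_error(frames, W, b, order=1):
    """Accumulated squared prediction error of one (hypothetical) linear
    predictive model: each frame is predicted from the previous `order`
    frames, concatenated into a single context vector."""
    err = 0.0
    for t in range(order, len(frames)):
        context = np.concatenate(frames[t - order:t])
        pred = W @ context + b
        err += float(np.sum((pred - frames[t]) ** 2))
    return err

def recognize(frames, models):
    """Minimum-prediction-error rule: the class whose model predicts the
    utterance best (smallest accumulated error) is the recognition result."""
    errors = {label: predictor_error(frames, W, b)
              for label, (W, b) in models.items()}
    return min(errors, key=errors.get)
```

In the RNPHMM itself, each HMM state would hold such a predictor and the errors would enter the HMM scoring; this sketch keeps only the per-class comparison.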

Modular Neural Network Using Recurrent Neural Network (궤환 신경회로망을 사용한 모듈라 네트워크)

  • 최우경;김성주;서재용;전흥태
    • Proceedings of the IEEK Conference
    • /
    • 2003.07d
    • /
    • pp.1565-1568
    • /
    • 2003
  • In this paper, we propose a modular network to solve difficult and complex problems that are seldom solved with a multi-layer neural network. The modular neural network structure researched by Jacobs and Jordan is adopted in this paper. The modular network consists of several expert networks and a gating network, each composed of a single-layer or multi-layer neural network. We propose a modular network structure using recurrent neural networks, since the state of the whole network at a particular time depends on an aggregate of previous states as well as on the current input. Finally, we show the superiority of the proposed network compared with the conventional modular network.

  • PDF
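
The Jacobs/Jordan modular structure referenced above blends expert outputs through a gating network. A minimal sketch with single-layer linear experts and a softmax gate (the simplest instance of that structure; all names and shapes here are illustrative, and the paper's recurrent experts are omitted):

```python
import numpy as np

def softmax(z):
    z = z - np.max(z)  # shift for numerical stability
    e = np.exp(z)
    return e / e.sum()

def modular_forward(x, expert_weights, gating_weights):
    """Jacobs/Jordan-style modular network: single-layer experts produce
    candidate outputs, a single-layer gating network produces mixing
    coefficients, and the final output is the gated blend."""
    expert_outs = np.array([W @ x for W in expert_weights])  # (n_experts, n_out)
    g = softmax(gating_weights @ x)                          # (n_experts,)
    return g @ expert_outs
```

Replacing each linear expert with a recurrent network, as the paper proposes, would let each module carry its own state across time while the gate still arbitrates among them.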

Recurrent Based Modular Neural Network

  • Yon, Jung-Heum;Park, Woo-Kyung;Kim, Yong-Min;Jeon, Hong-Tae
    • Proceedings of the Korean Institute of Intelligent Systems Conference
    • /
    • 2003.09a
    • /
    • pp.694-697
    • /
    • 2003
  • In this paper, we propose a modular network to solve difficult and complex problems that are seldom solved with a Multi-Layer Neural Network (MLNN). The Modular Neural Network (MNN) structure researched by Jacobs and Jordan is adopted in this paper. The modular network consists of several Expert Networks (EN) and a Gating Network (GN), each composed of a single-layer neural network (SLNN) or a multi-layer neural network. We propose a modular network structure using a Recurrent Neural Network (RNN), since the state of the whole network at a particular time depends on an aggregate of previous states as well as on the current input. Finally, we show the superiority of the proposed network compared with the conventional modular network.

  • PDF

Control of Chaos Dynamics in Jordan Recurrent Neural Networks

  • Jin, Sang-Ho;Kenichi, Abe
    • Institute of Control, Robotics and Systems: Conference Proceedings
    • /
    • 2001.10a
    • /
    • pp.43.1-43
    • /
    • 2001
  • We propose two methods for controlling the Lyapunov exponents of Jordan-type recurrent neural networks. Both methods are formulated as gradient-based learning. The first method is derived strictly from the definition of the Lyapunov exponents represented by the state transitions of the recurrent network. It can control the complete set of exponents, called the Lyapunov spectrum; however, it is computationally expensive because of its inherently recursive way of calculating the changes of the network parameters. This recursive calculation also makes the control unstable when at least one of the exponents is positive, such as the largest Lyapunov exponent in recurrent networks with chaotic dynamics. To improve stability in the chaotic situation, we propose a non-recursive formulation by approximating ...

  • PDF
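
For context on the quantity being controlled above: the largest Lyapunov exponent of a state-transition map (such as a Jordan network's output-feedback map) can be estimated numerically by tracking how a small perturbation grows. The sketch below is the standard Benettin-style estimator, not either of the paper's control methods; the function name and defaults are assumptions.

```python
import numpy as np

def largest_lyapunov(f, y0, n_steps=2000, eps=1e-8):
    """Benettin-style estimate of the largest Lyapunov exponent of a map
    y -> f(y): evolve a nearby trajectory, renormalize the separation to
    eps each step, and average the log expansion rates."""
    y = np.asarray(y0, dtype=float)
    d0 = np.zeros_like(y)
    d0[0] = eps
    y_pert = y + d0
    total = 0.0
    for _ in range(n_steps):
        y, y_pert = f(y), f(y_pert)
        d = y_pert - y
        dist = np.linalg.norm(d)
        total += np.log(dist / eps)      # log expansion this step
        y_pert = y + d * (eps / dist)    # renormalize the separation
    return total / n_steps
```

A positive estimate indicates chaotic dynamics; a gradient-based controller like the one proposed in the paper would adjust the network weights to push this value toward a target.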

Learning for Environment and Behavior Pattern Using Recurrent Modular Neural Network Based on Estimated Emotion (감정평가에 기반한 환경과 행동패턴 학습을 위한 궤환 모듈라 네트워크)

  • Kim, Seong-Joo;Choi, Woo-Kyung;Kim, Yong-Min;Jeon, Hong-Tae
    • Journal of the Korean Institute of Intelligent Systems
    • /
    • v.14 no.1
    • /
    • pp.9-14
    • /
    • 2004
  • Rational sense is affected by emotion. If we add the factor of emotion, estimated from environmental information, into robots, we may obtain more intelligent and human-friendly robots. However, for robots to learn emotion, diverse sensory information and pattern classification are required, so the networks must suit the robots' needs. A neural network has a superior ability to extract the characteristics of a system, but it also suffers from temporal crosstalk and convergence to local minima. To overcome these defects, many kinds of modular neural networks have been proposed, because they divide a complex problem into several simpler subproblems. The modular neural network introduced by Jacobs and Jordan shows an excellent ability to decompose and recombine complex tasks. On the other hand, a recurrent network acquires state representations, which make it suitable for diverse applications such as nonlinear prediction and modeling. In this paper, we apply recurrent networks as the expert networks in the modular neural network structure to learn data patterns based on emotional assessment. To show the performance of the proposed network, a simulation of learning the environment and behavior patterns is carried out with a real-time implementation. The given problem is very complex and has too many cases to learn. The results show the performance and capability of the proposed network, and are compared with the results of another method, the general modular neural network.

A study on the spoken digit recognition performance of the Two-Stage recurrent neural network (2단 회귀신경망의 숫자음 인식에관한 연구)

  • 안점영
    • The Journal of Korean Institute of Communications and Information Sciences
    • /
    • v.25 no.3B
    • /
    • pp.565-569
    • /
    • 2000
  • We construct a two-stage recurrent neural network that feeds the signals of both the hidden layer and the output layer back to the hidden layer. It is tested on a syllable basis for the Korean spoken digits from /gong/ to /gu/. For these experiments, we adjust the number of neurons in the hidden layer, the prediction order of the input data, and the self-recurrent coefficient of the decision-state layer. According to the experimental results, the recognition rate of this neural network is between 91% and 97.5% in the speaker-dependent case and between 80.75% and 92% in the speaker-independent case. In the speaker-dependent case, this network shows recognition performance equivalent to the Jordan and Elman networks, but in the speaker-independent case it shows improved performance.

  • PDF