• Title/Summary/Keyword: Error Backpropagation Learning Algorithm

60 results

Improved Error Backpropagation Algorithm using Modified Activation Function Derivative (수정된 Activation Function Derivative를 이용한 오류 역전파 알고리즘의 개선)

  • 권희용;황희영
    • The Transactions of the Korean Institute of Electrical Engineers / v.41 no.3 / pp.274-280 / 1992
  • In this paper, an improved error backpropagation algorithm is introduced that avoids network paralysis, one of the known problems of the error backpropagation learning rule. For this purpose, we analyze the cause of network paralysis and modify the activation function derivative of the standard error backpropagation algorithm, which is regarded as the source of the phenomenon. The characteristics of the modified activation function derivative are analyzed, and various experiments show that the modified algorithm performs better than the standard error backpropagation algorithm.
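
The abstract does not give the modified derivative itself; a well-known fix in the same spirit (Fahlman's flat-spot offset) adds a small constant to the sigmoid derivative so that saturated units keep receiving an error signal. A minimal sketch of that idea, with the offset value hypothetical:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def standard_derivative(y):
    # y = sigmoid(x); this vanishes as y -> 0 or 1, which is what
    # stalls learning ("network paralysis")
    return y * (1.0 - y)

def modified_derivative(y, offset=0.1):
    # hypothetical modification in the spirit of the paper: keep a
    # floor under the derivative so saturated units still learn
    return y * (1.0 - y) + offset

y = sigmoid(10.0)              # a strongly saturated unit
print(standard_derivative(y))  # ~4.5e-05: error signal dies out
print(modified_derivative(y))  # ~0.1: error signal survives
```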

A new learning algorithm for multilayer neural networks (새로운 다층 신경망 학습 알고리즘)

  • 고진욱;이철희
    • Proceedings of the IEEK Conference / 1998.10a / pp.1285-1288 / 1998
  • In this paper, we propose a new learning algorithm for multilayer neural networks. In error backpropagation, which is widely used for training multilayer neural networks, the weights are adjusted to reduce an error function that is the sum of squared errors over all neurons in the output layer. In the proposed learning algorithm, we instead consider each output of the output layer as a function of the weights and adjust the weights directly so that the output neurons produce the desired outputs. Experiments show that the proposed algorithm outperforms the backpropagation learning algorithm.
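
The abstract gives no formulas, so the following is only one plausible reading of "adjusting the weights directly": invert the output activation to find the pre-activation each output unit should produce, then drive the output-layer weights toward it rather than descending the summed squared error. All names and constants here are hypothetical:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def inverse_sigmoid(t, eps=1e-6):
    t = np.clip(t, eps, 1.0 - eps)
    return np.log(t / (1.0 - t))

def direct_output_update(W, h, target, lr=0.5):
    # move each output unit's pre-activation toward the value that
    # produces the desired output exactly (hypothetical reading)
    a = W @ h                          # current pre-activations
    a_star = inverse_sigmoid(target)   # pre-activations we want
    W += lr * np.outer(a_star - a, h)  # delta rule toward a_star
    return W

rng = np.random.default_rng(0)
W = rng.normal(size=(2, 4))
h = rng.uniform(size=4)                # hidden-layer activations
target = np.array([0.9, 0.1])
for _ in range(50):
    W = direct_output_update(W, h, target)
print(sigmoid(W @ h))                  # approaches [0.9, 0.1]
```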

Improved Error Backpropagation by Elastic Learning Rate and Online Update (가변학습율과 온라인모드를 이용한 개선된 EBP 알고리즘)

  • Lee, Tae-Seung;Park, Ho-Jin
    • Proceedings of the Korean Information Science Society Conference / 2004.04b / pp.568-570 / 2004
  • The error backpropagation (EBP) algorithm for training multilayer perceptrons (MLPs) is known for its robustness and economical efficiency. However, the algorithm has difficulty selecting an optimal constant learning rate, and thus yields non-optimal learning speed and inflexible operation for working data. This paper introduces into the original EBP algorithm an elastic learning rate that guarantees convergence of learning, along with its local realization by online update of the MLP parameters, in order to remedy this non-optimality. The results of experiments on a speaker verification system with a Korean speech database are presented and discussed to demonstrate the improvement of the proposed method over the original EBP algorithm in terms of learning speed and flexibility for working data.
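
The rate schedule itself is not given in the abstract; a common elastic scheme of this kind expands the learning rate while the epoch error keeps falling and contracts it on a setback, while the weights are updated online, pattern by pattern. A minimal sketch with a single linear neuron and hypothetical factors:

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 3))
true_w = np.array([1.0, -2.0, 0.5])
y = X @ true_w

w = np.zeros(3)
lr, up, down = 0.01, 1.05, 0.5       # hypothetical elastic factors
prev_err = np.inf
for epoch in range(20):
    for x, t in zip(X, y):           # online mode: update per pattern
        e = t - w @ x
        w += lr * e * x
    err = np.mean((y - X @ w) ** 2)
    # elastic rate: expand while improving, contract on a setback
    lr = lr * up if err < prev_err else lr * down
    prev_err = err
print(w, err)                        # w approaches true_w
```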

A study on time-varying control of learning parameters in neural networks (신경망 학습 변수의 시변 제어에 관한 연구)

  • 박종철;원상철;최한고
    • Proceedings of the Korea Institute of Convergence Signal Processing / 2000.12a / pp.201-204 / 2000
  • This paper describes a study on the time-varying control of parameters in the learning of a neural network. An Elman recurrent neural network (RNN) is used to implement the control of the parameters. The learning rate and momentum rate in the error backpropagation algorithm are updated at every iteration using fuzzy rules based on a performance index. In addition, the gain and slope of each neuron's activation function are also treated as time-varying parameters and updated using the gradient descent algorithm. Simulation results show that the auto-tuned learning algorithm yields faster convergence and lower system error than regular backpropagation in system identification.
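
Of the mechanisms named above, the gradient-descent update of an activation function's gain is compact enough to sketch; the fuzzy rate-tuning rules are not given in the abstract and are omitted here. A single-neuron sketch with hypothetical constants:

```python
import numpy as np

def f(x, gain):
    # sigmoid with a learnable gain (slope) parameter
    return 1.0 / (1.0 + np.exp(-gain * x))

rng = np.random.default_rng(2)
x = rng.normal(size=100)
target = f(x, gain=3.0)          # pretend the true gain is 3

w, gain, lr = 1.0, 1.0, 0.5
for _ in range(2000):
    i = rng.integers(len(x))
    a = w * x[i]
    y = f(a, gain)
    e = y - target[i]
    dy = y * (1.0 - y)           # sigmoid derivative factor
    # gradient descent on the weight and on the gain itself
    w -= lr * e * dy * gain * x[i]
    gain -= lr * e * dy * a
print(w, gain)                   # the product w*gain should approach 3
```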

A Method on the Learning Speed Improvement of the Online Error Backpropagation Algorithm in Speech Processing (음성처리에서 온라인 오류역전파 알고리즘의 학습속도 향상방법)

  • 이태승;이백영;황병원
    • The Journal of the Acoustical Society of Korea / v.21 no.5 / pp.430-437 / 2002
  • Having a variety of good characteristics compared with other pattern recognition techniques, the multilayer perceptron (MLP) has been widely used in speech recognition and speaker recognition. However, the error backpropagation (EBP) algorithm that the MLP uses in learning is known to require a long learning time, which severely restricts applications such as speaker recognition and speaker adaptation that require real-time processing. Because the learning data for pattern recognition contain high redundancy, it is very effective for increasing learning speed to use online learning methods, which update the weight vector of the MLP pattern by pattern. A typical online EBP algorithm applies a fixed learning rate for each update of the weight vector. Although a large speedup can be obtained with online EBP by choosing an appropriate fixed rate, fixing the rate leads to the problem that the algorithm cannot respond effectively as the learning phases change and the number of patterns contributing to learning decreases. To solve this problem, this paper proposes a Changing rate and Omitting patterns in Instant Learning (COIL) method that applies a variable rate, and only the patterns necessary for the current phase, when the learning phases change. Experiments on speaker verification and speech recognition are conducted, and the results are presented to verify the performance of COIL.

Constructing Neural Networks Using Genetic Algorithm and Learning Neural Networks Using Various Learning Algorithms (유전알고리즘을 이용한 신경망의 구성 및 다양한 학습 알고리즘을 이용한 신경망의 학습)

  • 양영순;한상민
    • Proceedings of the Computational Structural Engineering Institute Conference / 1998.04a / pp.216-225 / 1998
  • Although an artificial neural network based on the backpropagation algorithm is an excellent system simulator, the problems of deciding its structure and of its learning method remain unsolved. That is, there is no general approach to deciding the structure of a neural network, and training is often unsatisfactory because of the local optima into which it frequently falls. In addition, although there are many successful applications of the backpropagation learning algorithm, there have been few efforts to improve the learning algorithm itself. In this study, we suggest a general way to construct the hidden layer of a neural network using a binary genetic algorithm, and we also propose various learning methods by which the global minimum of the learning error can be obtained. An XOR problem and line heating problems are investigated as examples.
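
The abstract specifies a binary genetic algorithm over the hidden layer but no operators or settings. The sketch below encodes each candidate hidden layer as a bitmask, scores it by briefly training on XOR with plain backpropagation, and uses hypothetical selection, crossover, and mutation settings:

```python
import numpy as np

rng = np.random.default_rng(3)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)
T = np.array([0, 1, 1, 0], float)            # XOR targets
MAX_HIDDEN = 8

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def fitness(mask):
    # train a small MLP whose hidden layer has only the nodes the
    # binary chromosome switches on; return negative error
    n = int(mask.sum())
    if n == 0:
        return -np.inf
    W1 = rng.normal(size=(n, 2)); b1 = np.zeros(n)
    w2 = rng.normal(size=n); b2 = 0.0
    for _ in range(2000):                    # plain backpropagation
        H = sigmoid(X @ W1.T + b1)
        y = sigmoid(H @ w2 + b2)
        d2 = (y - T) * y * (1 - y)
        d1 = np.outer(d2, w2) * H * (1 - H)
        w2 -= 0.5 * H.T @ d2; b2 -= 0.5 * d2.sum()
        W1 -= 0.5 * d1.T @ X; b1 -= 0.5 * d1.sum(0)
    return -np.mean((y - T) ** 2) - 0.001 * n    # prefer small nets

pop = rng.integers(0, 2, size=(10, MAX_HIDDEN))
for gen in range(5):
    scores = np.array([fitness(m) for m in pop])
    pop = pop[np.argsort(scores)[::-1]][:5]      # keep the best half
    kids = pop.copy()
    cut = rng.integers(1, MAX_HIDDEN)            # one-point crossover
    kids[:, cut:] = np.roll(pop, 1, axis=0)[:, cut:]
    flip = rng.random(kids.shape) < 0.1          # bitwise mutation
    kids = np.where(flip, 1 - kids, kids)
    pop = np.vstack([pop, kids])
scores = np.array([fitness(m) for m in pop])
print("best hidden-layer mask:", pop[int(np.argmax(scores))])
```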

Fuzzy Supervised Learning Algorithm by using Self-generation (Self-generation을 이용한 퍼지 지도 학습 알고리즘)

  • 김광백
    • Journal of Korea Multimedia Society / v.6 no.7 / pp.1312-1320 / 2003
  • In this paper, we consider a multilayer neural network with a single hidden layer. The error backpropagation learning method widely used in multilayer neural networks can fall into local minima because of inadequate weights and an insufficient number of hidden nodes. We therefore propose a fuzzy supervised learning algorithm with self-generation, which generates its own hidden nodes through a compound fuzzy single-layer perceptron and a modified ART1. From the input layer to the hidden layer, the modified ART1 is used to produce nodes, and a winner-take-all method is adopted for the adaptation of connection weights, so that only the stored pattern that wins for a given input gets updated. The proposed method was applied to student identification card images. In the simulation results, the proposed method reduces the possibility of local minima and improves learning speed and paralysis compared with the conventional error backpropagation learning algorithm.
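
The modified ART1 itself is not detailed in the abstract; the sketch below illustrates only the self-generation idea, creating a new hidden prototype whenever no existing one passes a vigilance test and otherwise updating the winner alone, using cosine similarity and hypothetical thresholds:

```python
import numpy as np

def self_generate(patterns, vigilance=0.8, lr=0.5):
    # ART1-flavoured sketch: grow hidden prototypes on demand and
    # update only the winner (winner-take-all)
    prototypes = []
    for p in patterns:
        if prototypes:
            sims = [np.dot(w, p) / (np.linalg.norm(w) * np.linalg.norm(p) + 1e-9)
                    for w in prototypes]
            j = int(np.argmax(sims))
            if sims[j] >= vigilance:
                prototypes[j] += lr * (p - prototypes[j])  # winner only
                continue
        prototypes.append(p.astype(float).copy())  # self-generate a node
    return prototypes

rng = np.random.default_rng(4)
# two noisy binary clusters -> roughly two hidden nodes expected
base = np.array([[1, 1, 1, 0, 0, 0], [0, 0, 0, 1, 1, 1]], float)
data = [np.clip(b + rng.normal(0, 0.1, 6), 0, 1)
        for b in base[rng.integers(0, 2, 30)]]
print(len(self_generate(data)), "hidden nodes generated")
```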

Adaptive Blocking Artifacts Reduction Algorithm in Block Boundary Area Using Error Backpropagation Learning Algorithm (오류 역전파 학습 알고리듬을 이용한 블록경계 영역에서의 적응적 블록화 현상 제거 알고리듬)

  • 권기구;이종원;권성근;반성원;박경남;이건일
    • The Journal of Korean Institute of Communications and Information Sciences / v.26 no.9B / pp.1292-1298 / 2001
  • In this paper, we propose an adaptive blocking-artifact reduction algorithm for block-based coding that uses block classification in the spatial domain and a feedforward neural network filter. In the proposed method, each block boundary is classified as a flat region or an edge region using the statistical characteristics of the adjacent blocks, and adaptive inter-block filtering is then performed on the classes in which blocking artifacts are judged to have occurred. That is, in regions classified as flat where blocking artifacts have occurred, the artifacts are removed with a 2-layer neural network filter trained by the error backpropagation learning algorithm; in regions classified as complex where blocking artifacts have occurred, the artifacts are removed by adjusting only the intensity values of the pixels adjacent to the block boundary through linear interpolation, so as to preserve edge components. Simulation results confirm that the proposed method outperforms conventional methods in both objective and subjective image quality.
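
The abstract names the two boundary classes and their filters but gives no formulas. The sketch below shows the control flow only, with hypothetical thresholds, a stand-in `nn_filter` for the backpropagation-trained 2-layer network, and plain linear interpolation for the edge class:

```python
import numpy as np

def deblock_boundary(left, right, flat_threshold=5.0, nn_filter=None):
    # classify the vertical boundary between two blocks, then filter
    # adaptively; `nn_filter` stands in for the 2-layer MLP trained
    # offline with error backpropagation (not reproduced here)
    a, b = left[:, -2:].copy(), right[:, :2].copy()
    is_flat = a.std() + b.std() < flat_threshold   # boundary class
    if is_flat and nn_filter is not None:
        # flat region: apply the learned inter-block filter
        filtered = nn_filter(np.hstack([a, b]))
        left[:, -2:], right[:, :2] = filtered[:, :2], filtered[:, 2:]
    else:
        # edge region: preserve edges; re-set only the two pixels that
        # touch the boundary by interpolating between their neighbours
        left[:, -1] = (2 * a[:, 0] + b[:, 1]) / 3
        right[:, 0] = (a[:, 0] + 2 * b[:, 1]) / 3
    return left, right

L_blk = np.full((8, 8), 100.0)
R_blk = np.full((8, 8), 110.0)        # a visible 100|110 seam
deblock_boundary(L_blk, R_blk)        # no nn_filter -> edge path
print(L_blk[0, -2:], R_blk[0, :2])    # seam relaxed: 100, 103.3 | 106.7, 110
```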

Recurrent Neural Network with Backpropagation Through Time Learning Algorithm for Arabic Phoneme Recognition

  • Ismail, Saliza;Ahmad, Abdul Manan
    • Proceedings of the Institute of Control, Robotics and Systems Conference / 2004.08a / pp.1033-1036 / 2004
  • The study of speech recognition and understanding has been carried on for many years. In this paper, we propose a new type of recurrent neural network architecture for speech recognition, in which each output unit is connected to itself and is also fully connected to the other output units and to all hidden units [1]. We also propose the learning algorithm for this recurrent architecture, Backpropagation Through Time (BPTT), which is well suited to it. The aim of the study is to discriminate the letters of the Arabic alphabet, from "alif" to "ya", and its purpose is to advance knowledge and understanding of the Arabic alphabet and words by using a recurrent neural network (RNN) with the BPTT learning algorithm. Four speakers (a mixture of male and female) were recorded in a quiet environment for training. Neural networks are well known for their ability to classify nonlinear problems, and much research has applied them to speech recognition [2], including Arabic. The Arabic language offers a number of challenges for speech recognition [3]. Even though positive results have been obtained from continuing study, research on minimizing the error rate is still attracting a great deal of attention.
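
The paper's architecture additionally connects the output units to themselves and to each other, which the following sketch omits; it shows only the BPTT mechanics on a plain recurrent network, with all sizes and rates hypothetical:

```python
import numpy as np

def bptt_step(xs, ts, Wx, Wh, Wy, lr=0.1):
    # forward over the whole sequence, then roll the error backwards
    # through the same unrolled network (backpropagation through time)
    steps = len(xs)
    hs = {-1: np.zeros(Wh.shape[0])}
    ys = {}
    for t in range(steps):                # forward, unrolled in time
        hs[t] = np.tanh(Wx @ xs[t] + Wh @ hs[t - 1])
        ys[t] = Wy @ hs[t]
    dWx, dWh, dWy = map(np.zeros_like, (Wx, Wh, Wy))
    dh_next = np.zeros(Wh.shape[0])
    loss = 0.0
    for t in reversed(range(steps)):      # backward through time
        dy = ys[t] - ts[t]                # squared-error gradient
        loss += 0.5 * float(dy @ dy)
        dWy += np.outer(dy, hs[t])
        dh = Wy.T @ dy + dh_next
        da = dh * (1 - hs[t] ** 2)        # back through tanh
        dWx += np.outer(da, xs[t])
        dWh += np.outer(da, hs[t - 1])
        dh_next = Wh.T @ da
    for W, dW in ((Wx, dWx), (Wh, dWh), (Wy, dWy)):
        W -= lr * dW                      # in-place updates
    return loss

rng = np.random.default_rng(5)
Wx = rng.normal(0, 0.5, (8, 2))
Wh = rng.normal(0, 0.5, (8, 8))
Wy = rng.normal(0, 0.5, (2, 8))
xs = [rng.uniform(-1, 1, 2) for _ in range(6)]
ts = [np.zeros(2)] + xs[:-1]              # echo input, one step late
for _ in range(500):
    loss = bptt_step(xs, ts, Wx, Wh, Wy)
print("final loss:", loss)                # falls as the RNN learns
```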

A Fast-Learning Algorithm for MLP in Pattern Recognition (패턴인식의 MLP 고속학습 알고리즘)

  • Lee, Tae-Seung;Choi, Ho-Jin
    • Journal of KIISE: Computing Practices and Letters / v.8 no.3 / pp.344-355 / 2002
  • Having a variety of good characteristics compared with other pattern recognition techniques, the Multilayer Perceptron (MLP) has been used in a wide range of applications. However, the Error Backpropagation (EBP) algorithm that the MLP uses in learning is known to require a relatively long learning time. Because learning data in pattern recognition contain abundant redundancy, it is very effective for increasing learning speed to use online learning methods, which update the parameters of the MLP pattern by pattern. A typical online EBP algorithm applies a fixed learning rate for each parameter update. Although a large speedup can be obtained with online EBP by choosing an appropriate fixed rate, fixing the rate leads to the problem that the algorithm cannot respond effectively as the learning phases change and the learning pattern areas vary. To solve this problem, this paper defines learning as three phases and proposes an Instant Learning by Varying Rate and Skipping (ILVRS) method that reflects only the necessary patterns as the learning phases change. The basic concept of ILVRS is as follows. To discriminate and use the patterns that matter as learning proceeds, (1) ILVRS uses a variable learning rate, namely the error calculated from each pattern, suppressed within a proper range, and (2) ILVRS bypasses unnecessary patterns during the learning phases. In this paper, an experiment is conducted on speaker verification as an application of pattern recognition, and the results are presented to verify the performance of ILVRS.
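
A minimal sketch of the two ILVRS ingredients as described above: the per-pattern error reused as the learning rate, suppressed into a fixed range, and the bypassing of patterns that are already learned. All constants are hypothetical:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def ilvrs_style_epoch(W, patterns, targets, skip=0.01, lo=0.05, hi=0.5):
    # one online epoch: the error of each pattern sets its own
    # learning rate (clipped to [lo, hi]); well-learned patterns
    # are bypassed entirely
    used = 0
    for x, t in zip(patterns, targets):
        y = sigmoid(W @ x)
        err = float(np.abs(t - y).mean())
        if err < skip:
            continue                  # bypass unnecessary pattern
        lr = np.clip(err, lo, hi)     # variable rate from this pattern
        W += lr * np.outer((t - y) * y * (1 - y), x)
        used += 1
    return W, used

rng = np.random.default_rng(6)
X = rng.normal(size=(100, 5))
w_true = rng.normal(size=(1, 5))
T = (X @ w_true.T > 0).astype(float)  # toy two-class targets
W = np.zeros((1, 5))
for epoch in range(30):
    W, used = ilvrs_style_epoch(W, X, T)
print("patterns still contributing after 30 epochs:", used)
```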