• Title/Summary/Keyword: error backpropagation algorithm


Improved Error Backpropagation Algorithm using Modified Activation Function Derivative (수정된 Activation Function Derivative를 이용한 오류 역전파 알고리즘의 개선)

  • 권희용;황희영
    • The Transactions of the Korean Institute of Electrical Engineers / v.41 no.3 / pp.274-280 / 1992
  • In this paper, an improved error backpropagation algorithm is introduced that avoids network paralysis, one of the problems of the error backpropagation learning rule. For this purpose, the cause of network paralysis is analyzed, and the activation function derivative of the standard error backpropagation algorithm, which is regarded as the source of the phenomenon, is modified. The characteristics of the modified activation function derivative are analyzed. Various experiments show that the modified algorithm performs better than the standard error backpropagation algorithm.

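The abstract does not reproduce the paper's exact modification, so the following is only a minimal sketch of the general idea, assuming a sigmoid activation and a hypothetical constant floor `eps` that keeps the derivative away from zero so saturated neurons still receive an error signal:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_derivative(y):
    # standard derivative, written in terms of the output y = sigmoid(x);
    # it vanishes as y -> 0 or y -> 1, which is what stalls learning
    # ("network paralysis") for saturated neurons
    return y * (1.0 - y)

def sigmoid_derivative_modified(y, eps=0.1):
    # hypothetical modification: add a small floor so saturated neurons
    # still pass back a usable error signal (not the paper's exact formula)
    return y * (1.0 - y) + eps
```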

A new learning algorithm for multilayer neural networks (새로운 다층 신경망 학습 알고리즘)

  • 고진욱;이철희
    • Proceedings of the IEEK Conference / 1998.10a / pp.1285-1288 / 1998
  • In this paper, we propose a new learning algorithm for multilayer neural networks. In error backpropagation, which is widely used for training multilayer neural networks, the weights are adjusted to reduce an error function defined as the sum of squared errors over all neurons in the output layer of the network. In the proposed learning algorithm, we instead treat each output of the output layer as a function of the weights and adjust the weights directly so that the output neurons produce the desired outputs. Experiments show that the proposed algorithm outperforms the backpropagation learning algorithm.

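For reference, the baseline the abstract contrasts against is gradient descent on the sum of squared output errors. A minimal one-hidden-layer backpropagation step in NumPy (layer sizes and the toy data are illustrative assumptions, not taken from the paper) looks like:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def backprop_step(x, target, W1, W2, lr=0.1):
    """One gradient-descent step on E = 0.5 * sum((target - output)^2)."""
    # forward pass
    h = sigmoid(W1 @ x)          # hidden activations
    y = sigmoid(W2 @ h)          # output activations
    # backward pass
    delta_out = (y - target) * y * (1.0 - y)          # output-layer error signal
    delta_hid = (W2.T @ delta_out) * h * (1.0 - h)    # hidden-layer error signal
    # weight updates
    W2 -= lr * np.outer(delta_out, h)
    W1 -= lr * np.outer(delta_hid, x)
    return 0.5 * np.sum((target - y) ** 2)

# toy usage with illustrative sizes: 3 inputs, 4 hidden units, 2 outputs
rng = np.random.default_rng(0)
W1, W2 = rng.normal(size=(4, 3)), rng.normal(size=(2, 4))
err = backprop_step(rng.normal(size=3), np.array([0.0, 1.0]), W1, W2)
```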

Adaptive Error Constrained Backpropagation Algorithm (적응 오류 제약 Backpropagation 알고리즘)

  • 최수용;고균병;홍대식
    • The Journal of Korean Institute of Communications and Information Sciences / v.28 no.10C / pp.1007-1012 / 2003
  • In order to accelerate the convergence speed of the conventional BP algorithm, constrained optimization techniques are applied to the BP algorithm. First, the noise-constrained least mean square algorithm and the zero noise-constrained LMS algorithm are applied (designated the NCBP and ZNCBP algorithms, respectively). These methods involve an important assumption: the filter or the receiver in the NCBP algorithm must know the noise variance. By means of extension and generalization of these algorithms, the authors derive an adaptive error-constrained BP algorithm, in which the error variance is estimated. This is achieved by modifying the error function of the conventional BP algorithm using Lagrangian multipliers. The convergence speeds of the proposed algorithms are 20 to 30 times faster than those of the conventional BP algorithm, and are faster than or almost the same as that achieved with a conventional linear adaptive filter using an LMS algorithm.

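The abstract does not give the augmented error function itself, so the following is only a rough sketch of the general idea under stated assumptions: a hypothetical multiplier `lam` and an exponentially weighted running estimate of the error variance that scale the ordinary output-layer error signal of backpropagation.

```python
import numpy as np

def constrained_error_signal(error, sigma2_est, lam=0.5, beta=0.99):
    # running estimate of the error variance (a stand-in for the paper's
    # adaptive estimate; not the published update)
    sigma2_est = beta * sigma2_est + (1.0 - beta) * error ** 2
    # scale the usual BP error signal by a constraint term weighted by lam
    scaled = (1.0 + lam * (error ** 2 - sigma2_est)) * error
    return scaled, sigma2_est
```
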
Forecasting algorithm using an improved genetic algorithm based on backpropagation neural network model (개선된 유전자 역전파 신경망에 기반한 예측 알고리즘)

  • Yoon, YeoChang;Jo, Na Rae;Lee, Sung Duck
    • Journal of the Korean Data and Information Science Society / v.28 no.6 / pp.1327-1336 / 2017
  • In this study, the problems of short-term stock market forecasting are analyzed and the feasibility of the ARIMA method and the backpropagation neural network is discussed. Neural networks and genetic algorithms for short-term stock forecasting are also examined. Since the backpropagation algorithm often falls into the local-minima trap, we optimize the backpropagation neural network and build a forecasting model based on a genetic algorithm combined with the backpropagation neural network in order to achieve high forecasting accuracy. The experiments use the Korea Composite Stock Price Index (KOSPI) series for prediction and provide the corresponding error analysis. The results show that the genetic-algorithm-based backpropagation neural network model proposed in this study significantly improves the forecasting accuracy for the stock price index series.

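The abstract does not spell out how the genetic algorithm and the backpropagation network are combined. One common arrangement, sketched below purely as an assumption, is to let a simple evolutionary search (selection plus mutation only) look for good initial weights of a small MLP, which ordinary backpropagation would then fine-tune; the fine-tuning step and the real KOSPI features are not shown.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp_forward(w, X, n_hidden):
    """Tiny 1-hidden-layer MLP; w is a flat weight vector (illustrative shapes only)."""
    n_in = X.shape[1]
    W1 = w[:n_in * n_hidden].reshape(n_in, n_hidden)
    b1 = w[n_in * n_hidden:n_in * n_hidden + n_hidden]
    W2 = w[-(n_hidden + 1):-1]
    b2 = w[-1]
    h = np.tanh(X @ W1 + b1)
    return h @ W2 + b2

def fitness(w, X, y, n_hidden):
    return -np.mean((mlp_forward(w, X, n_hidden) - y) ** 2)   # negative MSE

def ga_init_weights(X, y, n_hidden=4, pop=30, gens=50, sigma=0.1):
    """Evolutionary search for good initial MLP weights (hypothetical setup)."""
    dim = X.shape[1] * n_hidden + n_hidden + n_hidden + 1
    population = rng.normal(0.0, 1.0, size=(pop, dim))
    for _ in range(gens):
        scores = np.array([fitness(w, X, y, n_hidden) for w in population])
        parents = population[np.argsort(scores)[-pop // 2:]]           # keep best half
        children = parents[rng.integers(0, len(parents), pop // 2)]    # clone parents
        children = children + rng.normal(0.0, sigma, children.shape)   # mutate
        population = np.vstack([parents, children])
    scores = np.array([fitness(w, X, y, n_hidden) for w in population])
    return population[np.argmax(scores)]

# toy usage on synthetic data (a stand-in for the real index features)
X = rng.normal(size=(100, 5))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=100)
w0 = ga_init_weights(X, y)   # would then be refined by standard backpropagation
```
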
Rejection of Interference Signal Using Neural Network in Multi-path Channel Systems (다중 경로 채널 시스템에서 신경회로망을 이용한 간섭 신호 제거)

  • 석경휴
    • Proceedings of the Acoustical Society of Korea Conference / 1998.06c / pp.357-360 / 1998
  • In DS/CDMA mobile communication systems, narrow-band interference and additive white Gaussian noise arise from multipath propagation, intentional jammers, and multiple users sharing the same bandwidth. When the available processing gain, on which system performance depends, is insufficient, such interference is not effectively suppressed. In this paper, a matched-filter channel model using a backpropagation neural network based on a complex multilayer perceptron is presented for suppressing narrow-band interference in a direct-sequence spread-spectrum receiver for DS/CDMA mobile communication systems. A recursive least squares backpropagation algorithm is used for fast convergence and better performance in the matched-filter receiver scheme. Computer simulation results over varying signal-to-noise ratios and transmission power ratios show that the bit error rate of the matched filter using the backpropagation neural network is lower than that of the RAKE receiver of the direct-sequence spread-spectrum system in the presence of co-channel and narrow-band interference.

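The abstract mentions a complex multilayer perceptron but not its exact form; the sketch below only illustrates one common convention (assumed here, not taken from the paper): complex-valued weights with a "split" activation applied separately to the real and imaginary parts.

```python
import numpy as np

def split_tanh(z):
    # split activation: apply tanh to real and imaginary parts separately
    return np.tanh(z.real) + 1j * np.tanh(z.imag)

def complex_mlp_forward(x, W1, W2):
    """Forward pass of a small complex-valued MLP (illustrative only)."""
    h = split_tanh(W1 @ x)
    return split_tanh(W2 @ h)

rng = np.random.default_rng(0)
W1 = rng.normal(size=(8, 4)) + 1j * rng.normal(size=(8, 4))
W2 = rng.normal(size=(2, 8)) + 1j * rng.normal(size=(2, 8))
x = rng.normal(size=4) + 1j * rng.normal(size=4)   # stand-in for received chips
y = complex_mlp_forward(x, W1, W2)
```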

An Effective Mapping for a Mobile Robot using Error Backpropagation based Sensor Fusion (오류 역전파 신경망 기반의 센서융합을 이용한 이동로봇의 효율적인 지도 작성)

  • Kim, Kyoung-Dong;Qu, Xiao-Chuan;Choi, Kyung-Sik;Lee, Suk-Gyu
    • Journal of the Korean Society for Precision Engineering / v.28 no.9 / pp.1040-1047 / 2011
  • This paper proposes a novel method based on error backpropagation neural networks to fuse laser sensor data and ultrasonic sensor data for enhancing the accuracy of mapping. For navigation, a single robot has to know its initial position and accurate information about the environment around it. However, due to the inherent properties of sensors, each sensor has its own advantages and drawbacks. In our system, a robot equipped with seven ultrasonic sensors and a laser sensor navigates and maps two different corridor environments. The experimental results show the effectiveness of the heterogeneous sensor fusion using an error backpropagation algorithm for mapping.

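As a rough illustration of the fusion step (the network size, inputs, and targets below are all assumptions; the abstract does not give them), a small feedforward network can map the seven ultrasonic ranges plus the laser range at a given bearing to a fused range estimate. scikit-learn's MLPRegressor, trained by gradient descent, stands in here for the paper's own error backpropagation network.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# synthetic stand-in data: 7 ultrasonic ranges + 1 laser range per sample,
# target = "true" range to the nearest obstacle (all values hypothetical)
true_range = rng.uniform(0.3, 4.0, size=500)
ultrasonic = true_range[:, None] + rng.normal(0.0, 0.15, size=(500, 7))  # noisier
laser = true_range + rng.normal(0.0, 0.02, size=500)                     # more precise
X = np.hstack([ultrasonic, laser[:, None]])

fusion_net = MLPRegressor(hidden_layer_sizes=(16,), solver="sgd",
                          learning_rate_init=0.01, max_iter=2000, random_state=0)
fusion_net.fit(X, true_range)
fused = fusion_net.predict(X[:5])   # fused range estimates for the first samples
```
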
Adaptive Blocking Artifacts Reduction Algorithm in Block Boundary Area Using Error Backpropagation Learning Algorithm (오류 역전파 학습 알고리듬을 이용한 블록경계 영역에서의 적응적 블록화 현상 제거 알고리듬)

  • 권기구;이종원;권성근;반성원;박경남;이건일
    • The Journal of Korean Institute of Communications and Information Sciences / v.26 no.9B / pp.1292-1298 / 2001
  • This paper proposes an adaptive blocking-artifact reduction algorithm for block-based coding that uses block classification in the spatial domain and a feedforward neural network filter. In the proposed method, each block boundary is classified as a smooth region or an edge region using the statistical characteristics of the adjacent blocks, and adaptive inter-block filtering is then applied to the classes in which blocking artifacts are judged to have occurred. That is, for regions classified as smooth in which blocking artifacts occur, the artifacts are removed using a two-layer neural network filter trained with the error backpropagation learning algorithm; for regions classified as complex in which blocking artifacts occur, the artifacts are removed by adjusting only the intensity values of the pixels adjacent to the block boundary using linear interpolation, in order to preserve edge components. Simulation results confirm that the proposed method outperforms conventional methods in terms of both objective and subjective image quality.

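A condensed sketch of the two-branch idea described above; the classification rule, thresholds, and window sizes are hypothetical, and the paper's trained two-layer filter is replaced by an untrained stub.

```python
import numpy as np

def classify_boundary(left_col, right_col, threshold=10.0):
    # crude stand-in for the paper's statistical block classification:
    # low activity on both sides of the boundary -> "smooth", else "edge"
    activity = max(np.var(left_col), np.var(right_col))
    return "smooth" if activity < threshold else "edge"

def nn_filter_stub(pixels):
    # placeholder for the paper's 2-layer network trained by error
    # backpropagation; here it just averages across the boundary
    return pixels.mean(axis=1, keepdims=True).repeat(pixels.shape[1], axis=1)

def deblock_vertical_boundary(img, col):
    """Reduce blocking artifacts at the vertical boundary between columns
    col-1 and col of a float image (2-pixel-wide treatment on each side)."""
    left, right = img[:, col - 2:col], img[:, col:col + 2]
    if classify_boundary(img[:, col - 1], img[:, col]) == "smooth":
        # smooth region with an artifact: apply the (stub) neural filter
        img[:, col - 2:col + 2] = nn_filter_stub(np.hstack([left, right]))
    else:
        # edge region: only re-interpolate the two pixels next to the
        # boundary so edge components are preserved
        a, b = img[:, col - 2], img[:, col + 1]
        img[:, col - 1] = a + (b - a) / 3.0
        img[:, col] = a + 2.0 * (b - a) / 3.0
    return img
```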

A study on time-varying control of learning parameters in neural networks (신경망 학습 변수의 시변 제어에 관한 연구)

  • 박종철;원상철;최한고
    • Proceedings of the Korea Institute of Convergence Signal Processing / 2000.12a / pp.201-204 / 2000
  • This paper describes a study on time-varying control of the parameters used in training a neural network. An Elman recurrent neural network (RNN) is used to implement the parameter control. The learning rate and momentum rate in the error backpropagation algorithm are updated at every iteration using fuzzy rules based on a performance index. In addition, the gain and slope of the neurons' activation function are also treated as time-varying parameters; these function parameters are updated using the gradient descent algorithm. Simulation results show that the auto-tuned learning algorithm results in faster convergence and lower system error than regular backpropagation in system identification.

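The abstract describes two mechanisms: fuzzy-rule updates of the learning and momentum rates, and gradient-descent updates of the activation function's gain and slope. Neither the fuzzy rule base nor the exact activation form is given, so the sketch below assumes a gain-and-slope sigmoid and replaces the fuzzy rules with a simple two-rule heuristic.

```python
import numpy as np

def act(x, gain, slope):
    # parametric sigmoid with time-varying gain and slope (assumed form)
    return gain / (1.0 + np.exp(-slope * x))

def act_partials(x, gain, slope):
    s = 1.0 / (1.0 + np.exp(-slope * x))
    d_gain = s                           # d f / d gain
    d_slope = gain * s * (1.0 - s) * x   # d f / d slope
    return d_gain, d_slope

def tune_rates(lr, momentum, err, prev_err):
    # crude two-rule stand-in for the fuzzy rules on the performance index:
    # speed up while the error keeps falling, back off when it rises
    if err < prev_err:
        return min(lr * 1.05, 1.0), min(momentum * 1.02, 0.95)
    return lr * 0.7, momentum * 0.9

# hypothetical gradient-descent update of gain and slope, given the error
# signal dE/df propagated to this neuron (random here, for illustration)
x = np.linspace(-2.0, 2.0, 5)
gain, slope, eta = 1.0, 1.0, 0.01
dE_df = np.random.default_rng(0).normal(size=x.shape)
d_gain, d_slope = act_partials(x, gain, slope)
gain -= eta * np.sum(dE_df * d_gain)
slope -= eta * np.sum(dE_df * d_slope)
```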

Improved Error Backpropagation by Elastic Learning Rate and Online Update (가변학습율과 온라인모드를 이용한 개선된 EBP 알고리즘)

  • Lee, Tae-Seung;Park, Ho-Jin
    • Proceedings of the Korean Information Science Society Conference / 2004.04b / pp.568-570 / 2004
  • The error backpropagation (EBP) algorithm for training multilayer perceptrons (MLPs) is known for its robustness and economical efficiency. However, the algorithm has difficulty selecting an optimal constant learning rate, which results in non-optimal learning speed and inflexible operation on working data. This paper introduces an elastic learning rate that guarantees convergence of learning, together with its local realization through online update of the MLP parameters, into the original EBP algorithm in order to remedy this non-optimality. Results of experiments on a speaker verification system with a Korean speech database are presented and discussed to demonstrate the improvement of the proposed method over the original EBP algorithm in terms of learning speed and flexibility on working data.

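The abstract does not state the elastic learning-rate rule itself. The sketch below assumes a common adaptive heuristic (grow the rate while the per-pattern error falls, cut it when the error rises) inside an online, pattern-by-pattern EBP loop, shown for a single sigmoid unit to keep it short.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def online_ebp_elastic(X, targets, epochs=50, lr=0.1, up=1.05, down=0.5):
    """Online (per-pattern) training of one sigmoid unit with an elastic
    learning rate: grow lr while the error falls, cut it when the error
    rises (assumed rule, not the paper's exact one)."""
    rng = np.random.default_rng(0)
    w = rng.normal(size=X.shape[1])
    prev_err = np.inf
    for _ in range(epochs):
        for x, t in zip(X, targets):               # online update: one pattern at a time
            y = sigmoid(w @ x)
            err = 0.5 * (t - y) ** 2
            w -= lr * (y - t) * y * (1.0 - y) * x  # EBP step for this pattern
            lr = lr * up if err < prev_err else lr * down
            prev_err = err
    return w, lr
```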

Recurrent Neural Network with Backpropagation Through Time Learning Algorithm for Arabic Phoneme Recognition

  • Ismail, Saliza;Ahmad, Abdul Manan
    • Proceedings of the Institute of Control, Robotics and Systems (ICROS) Conference / 2004.08a / pp.1033-1036 / 2004
  • The study of speech recognition and understanding has been conducted for many years. In this paper, we propose a new type of recurrent neural network architecture for speech recognition, in which each output unit is connected to itself and is also fully connected to the other output units and all hidden units [1]. We also propose the corresponding learning algorithm, Backpropagation Through Time (BPTT), which is well suited to this architecture. The aim of the study is to distinguish the Arabic letters from "alif" to "ya". The purpose of this research is to improve people's knowledge and understanding of the Arabic alphabet and words by using a recurrent neural network (RNN) with the BPTT learning algorithm. Four speakers (a mixture of male and female) were recorded in a quiet environment for training. Neural networks are well known as a technique able to classify nonlinear problems, and much research has been done on applying neural networks to speech recognition [2], including Arabic. The Arabic language offers a number of challenges for speech recognition [3]. Even though positive results have been obtained from continuing study, research on minimizing the error rate is still receiving a great deal of attention. This research uses a recurrent neural network to observe the differences among the letters "alif" through "ya".

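The paper's specific architecture (output units fed back to themselves and to each other) is not reproduced here; the sketch below shows BPTT itself on a plain vanilla RNN that classifies one sequence of frame features into one of a few classes, with all sizes chosen arbitrarily.

```python
import numpy as np

def bptt_step(xs, target, Wxh, Whh, Why, lr=0.05):
    """One BPTT update for a vanilla RNN that classifies a whole sequence
    (e.g. one phoneme) from its frame features; the single end-of-sequence
    softmax and all sizes are illustrative assumptions."""
    T = len(xs)
    hs = {-1: np.zeros(Whh.shape[0])}
    # forward pass, unrolled over time
    for t in range(T):
        hs[t] = np.tanh(Wxh @ xs[t] + Whh @ hs[t - 1])
    logits = Why @ hs[T - 1]
    p = np.exp(logits - logits.max())
    p /= p.sum()                                          # softmax
    loss = -np.log(p[target])
    # backward pass through time
    dWxh, dWhh, dWhy = np.zeros_like(Wxh), np.zeros_like(Whh), np.zeros_like(Why)
    dlogits = p.copy()
    dlogits[target] -= 1.0
    dWhy += np.outer(dlogits, hs[T - 1])
    dh = Why.T @ dlogits                                  # gradient flowing into h_{T-1}
    for t in reversed(range(T)):
        draw = (1.0 - hs[t] ** 2) * dh                    # back through tanh
        dWxh += np.outer(draw, xs[t])
        dWhh += np.outer(draw, hs[t - 1])
        dh = Whh.T @ draw                                 # pass error to the previous step
    for W, dW in ((Wxh, dWxh), (Whh, dWhh), (Why, dWhy)):
        W -= lr * dW                                      # in-place weight update
    return loss

# toy usage: 10 frames of 6-dim features, 4 hidden units, 3 classes
rng = np.random.default_rng(0)
Wxh, Whh, Why = (rng.normal(0, 0.3, (4, 6)), rng.normal(0, 0.3, (4, 4)),
                 rng.normal(0, 0.3, (3, 4)))
loss = bptt_step([rng.normal(size=6) for _ in range(10)], target=1,
                 Wxh=Wxh, Whh=Whh, Why=Why)
```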