• Title/Summary/Keyword: Backpropagation

Search Results: 591

Improve Digit Recognition Capability of Backpropagation Neural Networks by Enhancing Image Preprocessing Technique

  • Feng, Xiongfeng;Kubik, K.Bogunia
    • Institute of Control, Robotics and Systems: Conference Proceedings / 2001.10a / pp.49.4-49 / 2001
  • Digit recognition based on backpropagation neural networks, as an important application of pattern recognition, has attracted much attention. Although it has the advantages of parallel computation, high error tolerance, and learning capability, good recognition results can only be achieved when the digit image is supplied in a specific, fixed input format. Therefore, the quality of digit image preprocessing directly affects recognition accuracy. Here, using Matlab, the extracted digit image was enhanced by resizing it and rotating it to a neutral orientation, which improved the digit recognition capability of the backpropagation neural network under practical conditions. This method may also be helpful for recognizing other patterns with backpropagation neural networks.

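The abstract above names only the two preprocessing operations, resizing and rotating the extracted digit to a neutral orientation, and says the work was done in Matlab. A minimal Python/NumPy sketch of that kind of preprocessing, assuming a binarized digit crop, a 28x28 network input size, and second-order image moments for the rotation angle (the paper's own routine is not given), might look like this:

```python
import numpy as np
from scipy import ndimage  # assumed available; the paper itself used Matlab

def preprocess_digit(img, out_size=(28, 28)):
    """Resize a cropped digit image and rotate it to a neutral orientation.

    img: 2-D float array, background ~0, ink ~1 (an assumption for this sketch).
    out_size: fixed input size of the backpropagation network (example value).
    """
    ys, xs = np.nonzero(img > 0.5)
    if len(xs) == 0:
        return np.zeros(out_size)

    # Estimate the dominant slant from second-order image moments.
    x0, y0 = xs.mean(), ys.mean()
    mu11 = np.mean((xs - x0) * (ys - y0))
    mu20 = np.mean((xs - x0) ** 2)
    mu02 = np.mean((ys - y0) ** 2)
    angle = 0.5 * np.degrees(np.arctan2(2 * mu11, mu20 - mu02))

    # Rotate so the principal axis of the ink is upright ("neutral" orientation).
    rotated = ndimage.rotate(img, angle, reshape=False, order=1)

    # Crop the bounding box of the ink and resize to the fixed network input size.
    ys, xs = np.nonzero(rotated > 0.5)
    crop = rotated[ys.min():ys.max() + 1, xs.min():xs.max() + 1]
    zoom = (out_size[0] / crop.shape[0], out_size[1] / crop.shape[1])
    return np.clip(ndimage.zoom(crop, zoom, order=1), 0.0, 1.0)
```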

Improving the Training Performance of Neural Networks by using Hybrid Algorithm (하이브리드 알고리즘을 이용한 신경망의 학습성능 개선)

  • Kim, Weon-Ook;Cho, Yong-Hyun;Kim, Young-Il;Kang, In-Ku
    • The Transactions of the Korea Information Processing Society / v.4 no.11 / pp.2769-2779 / 1997
  • This paper proposes an efficient method for improving the training performance of neural networks using a hybrid of the conjugate gradient backpropagation algorithm and the dynamic tunneling backpropagation algorithm. The conjugate gradient backpropagation algorithm, a fast gradient algorithm, is applied for high-speed optimization. The dynamic tunneling backpropagation algorithm, a deterministic method with a tunneling phenomenon, is applied for global optimization. After converging to a local minimum with the conjugate gradient backpropagation algorithm, a new initial point for escaping the local minimum is estimated by the dynamic tunneling backpropagation algorithm. The proposed method has been applied to parity checking and pattern classification. The simulation results show that its performance is superior to that of the gradient descent backpropagation algorithm and of a hybrid of gradient descent and dynamic tunneling backpropagation, and that the new algorithm converges to the global minimum more often than the gradient descent backpropagation algorithm.

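The abstract describes the structure of the hybrid but not the tunneling dynamics themselves. The sketch below mirrors only that outer structure: a fast gradient phase that runs until it settles into a local minimum, followed by a tunneling-style escape that supplies the next starting point. The `loss` and `grad` callables are hypothetical stand-ins for the network error and its gradient, plain gradient descent stands in for conjugate-gradient backpropagation, and the escape step is a simplified placeholder (a random probe accepted only if it lands in a lower-loss region), not the deterministic tunneling system of the paper.

```python
import numpy as np

def hybrid_train(loss, grad, w0, n_cycles=10, gd_steps=500, lr=0.05, seed=0):
    """Alternate a gradient phase with a tunneling-style escape phase.

    loss(w) -> float and grad(w) -> array are assumed user-supplied callables.
    """
    rng = np.random.default_rng(seed)
    w = np.asarray(w0, dtype=float)
    best_w, best_loss = w.copy(), loss(w)

    for _ in range(n_cycles):
        # Phase 1: plain gradient descent stands in for conjugate-gradient BP.
        for _ in range(gd_steps):
            w = w - lr * grad(w)
        if loss(w) < best_loss:
            best_w, best_loss = w.copy(), loss(w)

        # Phase 2: "tunneling" placeholder - probe outward from the current
        # minimum for a point with lower loss and restart the gradient phase there.
        for radius in (0.1, 0.5, 1.0, 2.0):
            trial = best_w + radius * rng.standard_normal(best_w.shape)
            if loss(trial) < best_loss:
                w = trial
                break
        else:
            w = best_w + 0.5 * rng.standard_normal(best_w.shape)

    return best_w, best_loss
```

The same outer loop applies to the backpropagation-plus-tunneling method in the 1996 entry further down; only the gradient phase differs.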

Performance Improvement of Backpropagation Algorithm by Automatic Tuning of Learning Rate using Fuzzy Logic System

  • Jung, Kyung-Kwon;Lim, Joong-Kyu;Chung, Sung-Boo;Eom, Ki-Hwan
    • Journal of Information and Communication Convergence Engineering / v.1 no.3 / pp.157-162 / 2003
  • We propose a learning method for improving the performance of the backpropagation algorithm. The proposed method uses a fuzzy logic system to automatically tune the learning rate of each weight. Instead of choosing a fixed learning rate, the fuzzy logic system dynamically adjusts it. The inputs of the fuzzy logic system are delta and delta-bar, and its output is the learning rate. To verify the effectiveness of the proposed method, we performed simulations on the XOR problem, character classification, and function approximation. The results show that the proposed method considerably improves performance compared to the standard backpropagation, backpropagation with momentum, and Jacobs' delta-bar-delta algorithm.
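
The abstract specifies the two fuzzy inputs (delta, the current derivative for a weight, and delta-bar, its running average) and the single output (that weight's learning rate), but not the membership functions or rule base. The sketch below is therefore only an assumed, Sugeno-style illustration of the idea: when the derivative and its average agree in sign the rate is pushed up, and when they disagree (oscillation) it is pushed down.

```python
import numpy as np

def fuzzy_learning_rate(delta, delta_bar, lr_min=0.001, lr_max=0.5):
    """Per-weight learning rate from a tiny Sugeno-style fuzzy system (a sketch).

    delta:     current partial derivative for the weight
    delta_bar: exponentially averaged past derivatives
    The memberships and rule consequents below are assumptions, not the
    rule base from the paper.
    """
    z = np.tanh(delta * delta_bar * 1e3)   # squash sign agreement to [-1, 1]
    mu_agree = 0.5 * (1.0 + z)              # membership of "consistent descent"
    mu_oscillate = 1.0 - mu_agree            # membership of "oscillating"

    # Rule 1: IF consistent THEN learning rate is large.
    # Rule 2: IF oscillating THEN learning rate is small.
    return mu_agree * lr_max + mu_oscillate * lr_min   # weighted (defuzzified) output

def update_weight(w, delta, delta_bar, beta=0.7):
    """One backpropagation weight update with the fuzzy-tuned rate."""
    lr = fuzzy_learning_rate(delta, delta_bar)
    w_new = w - lr * delta
    delta_bar_new = (1.0 - beta) * delta + beta * delta_bar   # running average
    return w_new, delta_bar_new
```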

A new training method of multilayer neural networks using a hybrid of backpropagation algorithm and dynamic tunneling system (후향전파 알고리즘과 동적터널링 시스템을 조합한 다층신경망의 새로운 학습방법)

  • 조용현
    • Journal of the Korean Institute of Telematics and Electronics B / v.33B no.4 / pp.201-208 / 1996
  • This paper proposes an efficient method for improving the training performance of the neural network using a hybrid of the backpropagation algorithm and a dynamic tunneling system. The backpropagation algorithm, a fast gradient descent method, is applied for high-speed optimization. The dynamic tunneling system, a deterministic method with a tunneling phenomenon, is applied for global optimization. After converging to a local minimum with the backpropagation algorithm, the approximate initial point for escaping the local minimum is estimated by the dynamic tunneling system. The proposed method has been applied to pattern classification, and the simulation results show that its performance is superior to that of the backpropagation algorithm with randomized initial point settings.

Recognition of vehicle number plate using multi backpropagation neural network (다중 역전파 신경망을 이용한 차량 번호판의 인식)

  • 최재호;조범준
    • The Journal of Korean Institute of Communications and Information Sciences / v.22 no.11 / pp.2432-2438 / 1997
  • This paper proposes a recognition system that uses multiple backpropagation neural networks rather than a single backpropagation neural network to improve the character recognition rate obtained after extracting the vehicle-number region, since the vehicle number plate image from a CCD camera has a distinguishing feature, namely the illumination of the pattern. The experiments in this paper show that the method using multiple backpropagation neural networks requires less training time and achieves a higher recognition rate for vehicle numbers than a single backpropagation neural network.

Improved Error Backpropagation Algorithm using Modified Activation Function Derivative (수정된 Activation Function Derivative를 이용한 오류 역전파 알고리즘의 개선)

  • 권희용;황희영
    • The Transactions of the Korean Institute of Electrical Engineers / v.41 no.3 / pp.274-280 / 1992
  • In this paper, an improved error backpropagation algorithm is introduced that avoids network paralysis, one of the problems of the error backpropagation learning rule. For this purpose, we analyzed the cause of network paralysis and modified the activation function derivative of the standard error backpropagation algorithm, which is regarded as the source of the phenomenon. The characteristics of the modified activation function derivative are analyzed. Various experiments show that the performance of the modified error backpropagation algorithm is better than that of the standard error backpropagation algorithm.

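The abstract attributes network paralysis to the derivative of the sigmoid activation, f'(x) = f(x)(1 - f(x)), which collapses to zero for saturated units, but it does not print the modified derivative itself. A common fix in the same spirit, shown here only as an assumed illustration (cf. Fahlman's flat-spot elimination), is to keep the derivative bounded away from zero:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def standard_derivative(y):
    """Standard sigmoid derivative in terms of the output y = sigmoid(x).

    Near y = 0 or y = 1 this vanishes, so saturated units stop learning
    (the "network paralysis" the abstract refers to).
    """
    return y * (1.0 - y)

def modified_derivative(y, floor=0.1):
    """A flat-spot-corrected derivative that is never smaller than `floor`.

    The constant offset is an assumed illustration; the paper's own
    modification may differ.
    """
    return y * (1.0 - y) + floor
```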

Improving the Training Performance of Multilayer Neural Network by Using Stochastic Approximation and Backpropagation Algorithm (확률적 근사법과 후형질과 알고리즘을 이용한 다층 신경망의 학습성능 개선)

  • 조용현;최흥문
    • Journal of the Korean Institute of Telematics and Electronics B / v.31B no.4 / pp.145-154 / 1994
  • This paper proposes an efficient method for improving the training performance of a multilayer neural network by using a hybrid of stochastic approximation and the backpropagation algorithm. The proposed method improves training performance by applying a global optimization method that combines stochastic approximation with backpropagation. The approximate initial point for fast global optimization is first estimated by the stochastic approximation, and then the backpropagation algorithm, a fast gradient descent method, is applied for high-speed global optimization. Training is further accelerated by adaptively adjusting the training parameters of the output and hidden layers according to the standard deviation of the neuron outputs of each layer. The proposed method has been applied to parity checking and pattern classification, and the simulation results show that its performance is superior to that of the backpropagation, Baba's MROM, and Sun's method with randomized initial point settings. The results of adaptively adjusting the training parameters show that the proposed method further improves the convergence speed by about 20% during training.

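Apart from the global-optimization hybrid, the one concrete training detail in the abstract is that the training parameters of the output and hidden layers are adjusted adaptively to the standard deviation of each layer's neuron outputs. The exact adjustment law is not given, so the sketch below uses an assumed inverse-proportional rule purely as an illustration of per-layer adaptation:

```python
import numpy as np

def layerwise_learning_rates(layer_outputs, base_lr=0.1, eps=1e-6):
    """One learning rate per layer, scaled by the spread of that layer's outputs.

    layer_outputs: list of arrays, the activations of each hidden/output layer
    for the current batch. The inverse-proportional rule (small spread, i.e.
    saturated units, gets a larger step) is an assumed illustration, not the
    paper's formula.
    """
    rates = []
    for out in layer_outputs:
        sigma = float(np.std(out))
        rate = base_lr / (sigma + eps)
        rates.append(min(rate, 10.0 * base_lr))   # cap to keep steps bounded
    return rates
```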

Forecasting algorithm using an improved genetic algorithm based on backpropagation neural network model (개선된 유전자 역전파 신경망에 기반한 예측 알고리즘)

  • Yoon, YeoChang;Jo, Na Rae;Lee, Sung Duck
    • Journal of the Korean Data and Information Science Society / v.28 no.6 / pp.1327-1336 / 2017
  • In this study, the problems of short-term stock market forecasting are analyzed, and the feasibility of the ARIMA method and the backpropagation neural network is discussed. Neural networks and genetic algorithms for short-term stock forecasting are also examined. Since the backpropagation algorithm often falls into the local-minima trap, we optimized the backpropagation neural network and established a genetic algorithm based on the backpropagation neural network as the forecasting model in order to achieve high forecasting accuracy. The experiments used the Korea Composite Stock Price Index (KOSPI) series for prediction and provide the corresponding error analysis. The results show that the genetic-algorithm-based backpropagation neural network model proposed in this study significantly improves the forecasting accuracy for the stock price index series.
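
The abstract says the genetic algorithm is brought in because plain backpropagation tends to get trapped in local minima, but it does not describe the encoding or operators. One common way to combine the two, sketched below under the assumption that the GA searches over the network's initial weight vector and that fitness is the negative training error (the `train_error` callable is a hypothetical stand-in for the user's network), is to let the GA pick the starting point and then hand over to ordinary backpropagation:

```python
import numpy as np

def ga_initialize_weights(train_error, n_weights, pop=30, gens=50,
                          mut_rate=0.1, mut_scale=0.3, seed=0):
    """Search for good initial weights with a simple real-coded GA.

    train_error(w) -> float: training error of the BP network with weights w.
    Returns the best weight vector found; ordinary backpropagation would
    then continue from this point.
    """
    rng = np.random.default_rng(seed)
    population = rng.normal(0.0, 1.0, size=(pop, n_weights))

    for _ in range(gens):
        fitness = np.array([-train_error(w) for w in population])
        order = np.argsort(fitness)[::-1]          # best first
        parents = population[order[:pop // 2]]     # truncation selection

        children = []
        while len(children) < pop - len(parents):
            i, j = rng.integers(0, len(parents), size=2)
            alpha = rng.random(n_weights)
            child = alpha * parents[i] + (1 - alpha) * parents[j]        # blend crossover
            mask = rng.random(n_weights) < mut_rate
            child[mask] += mut_scale * rng.standard_normal(mask.sum())   # mutation
            children.append(child)

        population = np.vstack([parents, np.array(children)])

    fitness = np.array([-train_error(w) for w in population])
    return population[int(np.argmax(fitness))]
```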

Auto-Tuning Method of Learning Rate for Performance Improvement of Backpropagation Algorithm (역전파 알고리즘의 성능개선을 위한 학습율 자동 조정 방식)

  • Kim, Joo-Woong;Jung, Kyung-Kwon;Eom, Ki-Hwan
    • Journal of the Institute of Electronics Engineers of Korea CI / v.39 no.4 / pp.19-27 / 2002
  • We propose an auto-tuning method of the learning rate for improving the performance of the backpropagation algorithm. The proposed method uses a fuzzy logic system for automatic tuning of the learning rate. Instead of choosing a fixed learning rate, the fuzzy logic system dynamically adjusts it. The inputs of the fuzzy logic system are $\Delta$ and $\bar{\Delta}$, and the output is the learning rate. To verify the effectiveness of the proposed method, we performed simulations on an N-parity problem, function approximation, and Arabic numeral classification. The results show that the proposed method considerably improves performance compared to the backpropagation, backpropagation with momentum, and Jacobs' delta-bar-delta algorithm.
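
Both of the learning-rate papers above benchmark against Jacobs' delta-bar-delta rule, which is simple enough to state directly: every weight keeps its own learning rate, incremented by a constant when the current derivative and the averaged past derivative agree in sign, and cut multiplicatively when they disagree. A compact sketch follows; the constants are typical textbook choices, not values taken from either paper:

```python
import numpy as np

def delta_bar_delta_step(w, lr, delta_bar, grad, kappa=0.01, phi=0.5, beta=0.7):
    """One delta-bar-delta update (Jacobs, 1988) for a weight vector.

    w, lr, delta_bar, grad: arrays of the same shape (per-weight quantities).
    kappa (additive increase), phi (multiplicative decrease) and beta
    (averaging factor) are typical values, not the papers' settings.
    """
    agree = grad * delta_bar > 0      # descent direction is consistent
    disagree = grad * delta_bar < 0   # sign flip: stepping over a minimum

    lr = np.where(agree, lr + kappa, lr)
    lr = np.where(disagree, lr * phi, lr)

    w_new = w - lr * grad
    delta_bar_new = (1 - beta) * grad + beta * delta_bar
    return w_new, lr, delta_bar_new
```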

Rejection of Interference Signal Using Neural Network in Multi-path Channel Systems (다중 경로 채널 시스템에서 신경회로망을 이용한 간섭 신호 제거)

  • 석경휴
    • Proceedings of the Acoustical Society of Korea Conference / 1998.06c / pp.357-360 / 1998
  • In DS/CDMA mobile communication systems, narrow-band interference and additive white Gaussian noise arise from multipath propagation, intentional jammers, and multiple users sharing the same bandwidth. Because the processing gain, which determines system performance, cannot be made sufficiently large, this interference is not suppressed effectively. In this paper, a matched filter channel model using a backpropagation neural network based on a complex multilayer perceptron is presented for suppressing narrow-band interference in the direct sequence spread spectrum receiver of DS/CDMA mobile communication systems. A recursive least squares backpropagation algorithm with backpropagated error is used for fast convergence and better performance in the matched filter receiver scheme. Computer simulation results over a range of signal-to-noise ratios and transmission power ratios show that the bit error rate of the matched filter using the backpropagation neural network is improved over that of the RAKE receiver of the direct sequence spread spectrum system in the presence of co-channel and narrow-band interference.

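The abstract only says the receiver uses a backpropagation network based on a complex multilayer perceptron; the network size, activation, and the RLS-style training loop are not given. The sketch below shows just a forward pass of a complex-valued MLP with a split real/imaginary sigmoid, which is one common activation choice for such networks and is an assumption here, not the paper's construction:

```python
import numpy as np

def split_activation(z):
    """Apply a real sigmoid to the real and imaginary parts separately,
    a common (assumed) activation choice for complex-valued MLPs."""
    sig = lambda x: 1.0 / (1.0 + np.exp(-x))
    return sig(z.real) + 1j * sig(z.imag)

def complex_mlp_forward(x, weights):
    """Forward pass of a complex-valued multilayer perceptron.

    x: complex input vector (e.g. baseband chip samples of the received
       DS/CDMA signal); weights: list of complex matrices, one per layer.
    """
    a = np.asarray(x, dtype=complex)
    for W in weights:
        a = split_activation(W @ a)   # complex affine map, then split sigmoid
    return a
```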