Performance Improvement of Backpropagation Algorithm by Automatic Tuning of Learning Rate using Fuzzy Logic System

  • Jung, Kyung-Kwon (Department of Electronic Engineering, Dongguk University) ;
  • Lim, Joong-Kyu (Department of Electronic Engineering, Dongguk University) ;
  • Chung, Sung-Boo (Department of Electronic Engineering, Seoil College) ;
  • Eom, Ki-Hwan (Department of Electronic Engineering, Dongguk University)
  • Published : 2003.09.01

Abstract

We propose a learning method for improving the performance of the backpropagation algorithm. Instead of a fixed learning rate, the proposed method uses a fuzzy logic system to automatically tune the learning rate of each weight during training. The inputs of the fuzzy logic system are delta and delta-bar, and its output is the learning rate. To verify the effectiveness of the proposed method, we performed simulations on the XOR problem, character classification, and function approximation. The results show that the proposed method considerably improves performance compared to the standard backpropagation algorithm, backpropagation with momentum, and Jacobs' delta-bar-delta algorithm.
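
The abstract does not reproduce the membership functions or the rule table, so the Python sketch below only illustrates the general idea under assumed definitions: delta is the current gradient component of a weight, delta-bar is its exponential average (as in the delta-bar-delta rule), and a small Sugeno-style fuzzy system with assumed triangular memberships and a placeholder three-rule base maps the two to a per-weight learning rate. The function names, membership ranges, rule base, and constants are illustrative assumptions, not the authors' design.

    import numpy as np

    def tri(x, a, b, c):
        """Triangular membership function with peak at b and feet at a and c."""
        return np.maximum(np.minimum((x - a) / (b - a + 1e-12),
                                     (c - x) / (c - b + 1e-12)), 0.0)

    def fuzzy_learning_rate(delta, delta_bar,
                            eta_small=0.01, eta_medium=0.1, eta_large=0.5):
        """Map (delta, delta_bar) to a per-weight learning rate.

        Assumed memberships (Negative/Zero/Positive on a normalized range)
        and the three rules below are placeholders, not the paper's rule table.
        """
        # Squash inputs so memberships defined on [-1, 1] apply.
        d, db = np.tanh(delta), np.tanh(delta_bar)

        # Membership degrees for each linguistic term.
        neg_d, zero_d, pos_d = tri(d, -2, -1, 0), tri(d, -1, 0, 1), tri(d, 0, 1, 2)
        neg_b, zero_b, pos_b = tri(db, -2, -1, 0), tri(db, -1, 0, 1), tri(db, 0, 1, 2)

        # Rules (Sugeno-style singleton consequents):
        #   consistent gradient direction   -> large learning rate
        #   near-zero gradient              -> medium learning rate
        #   sign disagreement (oscillation) -> small learning rate
        w_large = max(min(pos_d, pos_b), min(neg_d, neg_b))
        w_medium = max(zero_d, zero_b)
        w_small = max(min(pos_d, neg_b), min(neg_d, pos_b))

        weights = np.array([w_small, w_medium, w_large])
        etas = np.array([eta_small, eta_medium, eta_large])
        # Weighted-average defuzzification.
        return float(weights @ etas / (weights.sum() + 1e-12))

    def update_weight(w, grad, delta_bar, beta=0.7):
        """One per-weight update step; delta_bar is the running gradient average."""
        delta_bar = beta * delta_bar + (1 - beta) * grad
        eta = fuzzy_learning_rate(grad, delta_bar)
        return w - eta * grad, delta_bar

In this sketch the fuzzy system plays the role that the fixed increment/decrement heuristic plays in the delta-bar-delta rule: agreement between the current and averaged gradient directions yields a larger step, oscillation yields a smaller one.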

References

  1. Martin T. Hagan, Howard B. Demuth, Mark Beale, Neural Network Design, PWS Publishing, Boston, 1995
  2. Peiman G. Maghami and Dean W. Sparks, 'Design of Neural Networks for Fast Convergence and Accuracy: Dynamics and Control,' IEEE Transactions on Neural Networks, vol. 11, no. 1, pp. 113-123, 2000 https://doi.org/10.1109/72.822515
  3. Tokumitsu Fujita, Takao Watanabe and Keiichiro Yasuda, 'A Study on Improvement in Learning Efficiency of Multilayered Neural Networks based on Dynamical System,' Trans. IEE Japan, vol. 117-C, no. 12, pp. 1848-1855, 1997
  4. C. Charalambous, 'Conjugate gradient algorithm for efficient training of artificial neural networks,' IEE Proceedings, vol. 139, no. 3, pp. 301-310, 1992
  5. M. T. Hagan and M. Menhaj, 'Training feedforward networks with the Marquardt algorithm,' IEEE Transactions on Neural Networks, vol. 5, no. 6, pp. 187-199, 1994 https://doi.org/10.1109/72.329697
  6. H. Y. Y. Sanossian and D. J. Evans, 'Gradient Range-Based Heuristic Method for Accelerating Neural Network Convergence,' Integrated Computer-Aided Engineering, vol. 2, pp. 147-152, 1995
  7. R. A. Jacobs, 'Increased rates of convergence through learning rate adaptation,' Neural Networks, vol. 1, no. 4, pp. 295-308, 1988 https://doi.org/10.1016/0893-6080(88)90003-2
  8. T. P. Vogl, J. K. Mangis, A. K. Zigler, W. T. Zink and D. L. Alkon, 'Accelerating the convergence of the backpropagation method,' Biological Cybernetics, vol. 59, pp. 256-264, 1988
  9. W. Pedrycz, Fuzzy Control and Fuzzy Systems, John Wiley & Sons, Inc., 1992
  10. Laurene Fausett, Fundamentals of Neural Networks, Prentice Hall, 1994
  11. Howard Demuth, Mark Beale, Neural Network Toolbox User's Guide, The MathWorks, 1994