A Simple Approach of Improving Back-Propagation Algorithm

  • Zhu, H. (Department of Computer Science, Faculty of Engineering, Hiroshima Kokusai Gakuin University)
  • Eguchi, K. (Kumamoto National College of Technology)
  • Tabata, T. (Kumamoto National College of Technology)
  • Sun, N. (Kumamoto National College of Technology)
  • Published: 2000.07.01

Abstract

The enhancement to the back-propagation algorithm presented in this paper arose from the need to extract sparsely connected networks from networks employing product terms. The enhancement works in conjunction with the back-propagation weight update process, so that the actions of weight zeroing and weight stimulation reinforce each other. It is shown that the error measure can also be interpreted as a rate of weight change (as opposed to $\Delta W_{ij}$) and consequently used to determine when weights have reached a stable state. Weights judged to be stable are then compared against a zero-weight threshold; should a weight fall below this threshold, it is zeroed. Simulations show that such a system yields improved learning rates and reduced network connection requirements relative to the optimal network solution trained with the standard back-propagation algorithm, for Multi-Layer Perceptron (MLP), Higher Order Neural Network (HONN), and Sigma-Pi networks.
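The sketch below illustrates the pruning idea the abstract describes: a standard back-propagation weight update, followed by a check that a weight's update rate has stabilized and that its magnitude is below a zero-weight threshold, in which case it is zeroed. This is a minimal illustration only; the parameter names `stability_eps` and `zero_threshold`, the single-layer tanh network, and the specific stability test are assumptions, not details taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def train_step(W, x, target, lr, stability_eps, zero_threshold):
    """One back-propagation step followed by a weight-zeroing heuristic.

    stability_eps and zero_threshold are illustrative assumptions,
    not values or names from the paper.
    """
    y = np.tanh(W @ x)                       # forward pass
    err = y - target                         # error for squared-error loss
    grad = np.outer(err * (1.0 - y**2), x)   # dE/dW for tanh units
    delta_W = -lr * grad
    W = W + delta_W

    # A weight whose update rate has fallen below stability_eps is
    # treated as stable; if its magnitude is also below the zero-weight
    # threshold, the connection is pruned by zeroing it.
    stable = np.abs(delta_W) < stability_eps
    small = np.abs(W) < zero_threshold
    W[stable & small] = 0.0
    return W

# Toy usage: a 3-input, 2-output layer trained on one fixed pattern.
W = rng.normal(scale=0.5, size=(2, 3))
x = np.array([0.5, -1.0, 0.25])
target = np.array([1.0, -1.0])
for _ in range(200):
    W = train_step(W, x, target, lr=0.1,
                   stability_eps=1e-4, zero_threshold=0.05)
print(W)  # some stable, small-magnitude connections may now be zero
```

In this reading, pruning only fires once both conditions hold, so weights that are still moving are left alone even if they are momentarily small, which matches the abstract's point that stability is judged from the rate of weight change rather than from weight magnitude alone.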

Keywords