Speeding up the error back-propagation algorithm using micro-genetic algorithms

(Korean title: A method for improving the learning speed of the error back-propagation algorithm using micro-genetic algorithms)

  • Published : 1993.10.01

Abstract

The error back-propagation (BP) algorithm is widely used for finding the optimum weights of multi-layer neural networks. However, its critical drawback is slow error convergence. The major reason for this slow convergence is premature saturation, a phenomenon in which the error of a neural network stays almost constant for some period of time during learning. An inappropriate selection of initial weights causes each neuron to become trapped in the premature-saturation state, which slows the convergence of the multi-layer neural network. In this paper, to overcome this problem, micro-genetic algorithms (μ-GAs), which can find near-optimal values, are used to select proper initial weights and slopes of the neurons' activation functions. The effectiveness of the proposed algorithm is demonstrated by computer simulations of a two-degree-of-freedom planar robot manipulator.
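The sketch below is an illustrative reading of the idea described in the abstract, not the paper's exact formulation: a micro-genetic algorithm (a very small population with elitism and random restarts, and no mutation) searches for a set of initial weights and a sigmoid slope for a tiny 2-4-1 network, using the error remaining after a few epochs of plain back-propagation as the fitness. The toy regression data, network size, and all hyperparameters are assumptions chosen only to make the example self-contained.

```python
# Hedged sketch: micro-GA selection of BP initial weights and sigmoid slope.
# Toy data and all parameter choices are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data (stand-in for the paper's robot-manipulator task).
X = rng.uniform(-1.0, 1.0, size=(64, 2))
y = np.sin(X[:, :1]) + 0.5 * X[:, 1:]

N_IN, N_HID, N_OUT = 2, 4, 1
N_GENES = N_IN * N_HID + N_HID * N_OUT + 1   # hidden + output weights + slope


def decode(genome):
    """Split a flat genome into weight matrices and the activation slope."""
    genome = np.array(genome, dtype=float)    # copy so BP does not alter the population
    w1 = genome[:N_IN * N_HID].reshape(N_IN, N_HID)
    w2 = genome[N_IN * N_HID:-1].reshape(N_HID, N_OUT)
    slope = abs(genome[-1]) + 0.1             # keep the sigmoid slope positive
    return w1, w2, slope


def bp_error(genome, epochs=20, lr=0.5):
    """Fitness: MSE remaining after a few epochs of standard back-propagation."""
    w1, w2, slope = decode(genome)
    err = np.zeros_like(y)
    for _ in range(epochs):
        h = 1.0 / (1.0 + np.exp(-slope * (X @ w1)))   # sigmoid hidden layer
        out = h @ w2                                   # linear output layer
        err = out - y
        g2 = h.T @ err / len(X)                        # gradient w.r.t. w2
        g1 = X.T @ ((err @ w2.T) * slope * h * (1 - h)) / len(X)  # w.r.t. w1
        w1 -= lr * g1
        w2 -= lr * g2
    return float(np.mean(err ** 2))


POP, GENERATIONS = 5, 40                      # micro-GA: very small population
pop = rng.normal(0.0, 1.0, size=(POP, N_GENES))
best = pop[0].copy()

for gen in range(GENERATIONS):
    fitness = np.array([bp_error(ind) for ind in pop])
    best = pop[fitness.argmin()].copy()
    # Restart when the micro-population has converged: keep the elite,
    # re-seed the rest at random (the defining trick of a micro-GA).
    if np.std(pop) < 0.05:
        pop = rng.normal(0.0, 1.0, size=(POP, N_GENES))
        pop[0] = best
        continue
    # Elitism plus uniform crossover; diversity comes from the restarts.
    new_pop = [best]
    while len(new_pop) < POP:
        a, b = pop[rng.choice(POP, 2, replace=False)]
        mask = rng.random(N_GENES) < 0.5
        new_pop.append(np.where(mask, a, b))
    pop = np.array(new_pop)

print("error after GA-selected initialization:", bp_error(best, epochs=200))
```

Because the fitness is the error left after a short burst of BP, genomes whose initial weights and slope drive the sigmoids into saturation score poorly and are discarded, which is one plausible way to realize the abstract's goal of avoiding premature saturation before full training begins.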

Keywords