
Comparative Analysis on Error Back Propagation Learning and Layer By Layer Learning in Multi Layer Perceptrons  

Young-Tae Kwak (Department of Computer Science, Iksan National College)
Abstract
This paper surveys EBP (Error Back Propagation) learning, the Cross Entropy function, and LBL (Layer By Layer) learning, which are used for training MLPs (Multi-Layer Perceptrons). We compare the merits and drawbacks of each learning method on a handwritten digit recognition task. Although EBP learning is slower than the other methods in the initial learning phase, its generalization capability is the best. The Cross Entropy function, which compensates for the weaknesses of EBP learning, converges faster than EBP learning, but its generalization capability is worse because the error signal of the output layer approaches the target vector linearly. LBL learning is the fastest of the methods in the initial learning phase; however, since it cannot improve further after a certain point, it has the lowest generalization capability. Based on these results, this paper proposes a guideline for selecting a learning method when applying MLPs.
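The contrast the abstract draws between EBP and the Cross Entropy function can be sketched numerically. With squared error, the output-layer error signal carries the sigmoid derivative, which nearly vanishes for saturated units and slows early learning; with cross entropy and sigmoid outputs the derivative cancels, leaving a signal linear in the output error. This is a minimal illustration under those standard assumptions, with hypothetical values, not the paper's experimental setup:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# One training example with three output units (hypothetical values).
net = np.array([4.0, -4.0, 0.5])   # pre-activations of the output layer
y = sigmoid(net)                   # output activations
t = np.array([0.0, 1.0, 1.0])      # target vector

# EBP with squared error: delta = (y - t) * f'(net).
# The factor y*(1 - y) shrinks toward 0 for saturated units,
# so the error signal is damped early in learning.
delta_mse = (y - t) * y * (1.0 - y)

# Cross entropy with sigmoid outputs: the derivative cancels,
# leaving delta = (y - t), linear in the output error.
delta_ce = y - t
```

Because `y * (1 - y) <= 0.25`, the squared-error signal is always smaller in magnitude than the cross-entropy signal, which is why cross entropy converges faster in the initial phase, as the abstract notes.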
Keywords
Multi-Layer Perceptron; Error Back Propagation learning; Cross Entropy function; Layer-By-Layer learning