
Comparison of Factors for Controlling Effects in MLP Networks  

윤여창 (Department of Computational Statistics, Woosuk University)
Abstract
The Multi-Layer Perceptron (MLP) network has been widely applied to many practical problems because of its nonlinear mapping ability. However, the generalization ability of MLP networks may be affected by the number of hidden nodes, the initial values of the weights, and the training error. If these factors are chosen improperly, the generalization ability of MLP networks may be poor. It is therefore important to identify these factors and their interactions in order to control the generalization ability of MLP networks effectively. In this paper, we empirically identify the factors that affect the generalization ability of MLP networks and compare their relative effects on generalization performance for the conventional and the visualized weight-selection methods using the controller box.
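The sketch below is a minimal illustration, not the paper's experimental setup or the controller-box method, of how the three factors named in the abstract can be varied and their effect on generalization observed. It assumes a synthetic regression task and a from-scratch NumPy MLP with one hidden layer; the data set, function names, and factor settings are assumptions made for illustration only. Each run varies the number of hidden nodes, the initial weight range, and the training-error threshold used to stop training, and reports the resulting test error as a proxy for generalization.

# Minimal sketch (not the paper's implementation): a one-hidden-layer MLP
# trained on noisy data from a known target function, used to compare how
# three factors -- number of hidden nodes, initial weight range, and the
# training-error threshold used to stop training -- affect the test error.
# All settings here are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic regression task: y = sin(2*pi*x) + noise on [0, 1]
def make_data(n, noise=0.1):
    x = rng.uniform(0.0, 1.0, size=(n, 1))
    y = np.sin(2 * np.pi * x) + rng.normal(0.0, noise, size=(n, 1))
    return x, y

def train_mlp(x, y, n_hidden, init_range, target_mse, lr=0.05, max_epochs=20000):
    """Full-batch gradient descent for a 1-h-1 MLP with tanh hidden units.
    Training stops when the training MSE falls below target_mse (the
    training-error factor) or when max_epochs is reached."""
    W1 = rng.uniform(-init_range, init_range, size=(1, n_hidden))
    b1 = np.zeros(n_hidden)
    W2 = rng.uniform(-init_range, init_range, size=(n_hidden, 1))
    b2 = np.zeros(1)
    n = len(x)
    for _ in range(max_epochs):
        h = np.tanh(x @ W1 + b1)      # hidden activations
        out = h @ W2 + b2             # linear output
        err = out - y
        mse = float(np.mean(err ** 2))
        if mse < target_mse:
            break
        # Backpropagation of the mean squared-error loss
        g_out = 2.0 * err / n
        gW2 = h.T @ g_out
        gb2 = g_out.sum(axis=0)
        g_h = (g_out @ W2.T) * (1.0 - h ** 2)
        gW1 = x.T @ g_h
        gb1 = g_h.sum(axis=0)
        W1 -= lr * gW1; b1 -= lr * gb1
        W2 -= lr * gW2; b2 -= lr * gb2
    return (W1, b1, W2, b2), mse

def test_mse(params, x, y):
    W1, b1, W2, b2 = params
    out = np.tanh(x @ W1 + b1) @ W2 + b2
    return float(np.mean((out - y) ** 2))

x_tr, y_tr = make_data(50)
x_te, y_te = make_data(500)

# Vary each factor and observe the effect on the test error.
for n_hidden in (2, 5, 20):
    for init_range in (0.1, 1.0):
        for target in (0.02, 0.005):
            params, tr = train_mlp(x_tr, y_tr, n_hidden, init_range, target)
            te = test_mse(params, x_te, y_te)
            print(f"hidden={n_hidden:2d} init={init_range:.1f} "
                  f"target={target:.3f} train={tr:.4f} test={te:.4f}")

Comparing the printed training and test errors across runs shows the kind of interaction the paper studies: a large hidden layer combined with a very low training-error target tends to lower training error while raising test error, and the initial weight range shifts where that trade-off occurs.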
Keywords
Multi-Layer Perceptron; controller box; complexity control
Citations & Related Records
Times Cited By KSCI: 1
1 Cherkassky, V. and Mulier, F., 'Learning from Data: Concepts, Theory, and Methods,' Wiley, New York, 1998
2 Vapnik, V., 'The Nature of Statistical Learning Theory,' Springer-Verlag, New York, 1995
3 Zhong, S. and Cherkassky, V., 'Factors Controlling Generalization Ability of MLP Networks,' Proceedings of International Joint Conference on Neural Networks, 1999
4 윤여창, 'An Improved Learning Process of Simple Neural Networks Using a Controller Box,' Journal of KIISE: Software and Applications, Vol.28, No.4, pp.338-345, 2001
5 Easton, G.S., 'A Simple Dynamic Graphical Diagnostic Method for Almost Any Model,' Journal of the American Statistical Association, Vol.89, pp.201-207, 1994
6 Smith, M., 'Neural Networks for Statistical Modeling,' Van Nostrand Reinhold, New York, 1993
7 Kim, Y.K. and Ra, J.B., 'Weight value initialization for improving training speed in the back propagation network,' Proceedings of International Joint Conference on Neural Networks, Vol.3, pp.2396-2401, 1991
8 Cherkassky, V. and Shepherd, R., 'Regularization Effect of Weight Initialization in Back Propagation Networks,' Proceedings of International Joint Conference on Neural Networks, pp.2258-2261, 1998
9 Espinosa, C.H. and Redondo, M.F., 'Multilayer feedforward weight initialization,' Proceedings of International Joint Conference on Neural Networks, pp.166-170, 2001
10 Thimm, G. and Fiesler, E., 'High-order and multilayer perceptron initialization,' IEEE Transactions on Neural Networks, Vol.8, pp.349-359, 1997
11 Hagan, M.T., Demuth, H.B. and Beale, M., 'Neural Network Design,' PWS, Boston, 1995
12 Atiya, A. and Ji, C., 'How Initial Conditions Affect Generalization Performance in Large Networks,' IEEE Transactions on Neural Networks, Vol.8, No.2, pp.448-451, 1997