http://dx.doi.org/10.5391/JKIIS.2009.19.4.574

Stepwise Constructive Method for Neural Networks Using a Flexible Incremental Algorithm  

Park, Jin-Il (Dept. of Control and Robot Engineering, College of Electrical and Computer Engineering, Chungbuk National University)
Jung, Ji-Suk (Dept. of Control and Robot Engineering, College of Electrical and Computer Engineering, Chungbuk National University)
Cho, Young-Im (Dept. of Computer Science, College of IT, The University of Suwon)
Chun, Myung-Geun (Dept. of Control and Robot Engineering, College of Electrical and Computer Engineering, Chungbuk National University)
Publication Information
Journal of the Korean Institute of Intelligent Systems, Vol. 19, No. 4, 2009, pp. 574-579
Abstract
Constructing an optimized neural network for complex nonlinear regression problems poses many difficulties, such as selecting the network structure and avoiding the overtraining caused by noise. In this paper, we propose a stepwise constructive method for neural networks using a flexible incremental algorithm. As hidden nodes are added, the flexible incremental algorithm adaptively controls their number using a validation dataset so as to minimize the prediction residual error. Here, the ELM (Extreme Learning Machine) is used for fast training. The proposed neural network is a universal approximator that requires no user intervention during training, and it also trains faster and uses fewer hidden nodes. Experimental results on various benchmark datasets show that the proposed method outperforms previous methods on real-world regression problems.
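The paper itself gives no code, but the growth strategy the abstract describes can be sketched as follows: train an ELM (random hidden weights, least-squares output weights), append hidden nodes step by step, and stop when the validation RMSE stops improving. This is a minimal illustrative sketch, not the authors' implementation; the function names, the `patience` early-stopping rule, and all parameter values are assumptions.

```python
import numpy as np

def elm_fit(X, y, W, b):
    """Solve the ELM output weights for a fixed random hidden layer."""
    H = np.tanh(X @ W + b)            # hidden-layer activations
    return np.linalg.pinv(H) @ y      # least-squares output weights

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

def incremental_elm(X_tr, y_tr, X_val, y_val,
                    max_nodes=50, step=1, patience=5, seed=0):
    """Hypothetical sketch of a flexible incremental ELM: grow the
    hidden layer and keep the network with the lowest validation RMSE."""
    rng = np.random.default_rng(seed)
    d = X_tr.shape[1]
    W = np.empty((d, 0))              # input weights, one column per node
    b = np.empty((1, 0))              # hidden biases
    best_rmse, best_model = np.inf, None
    no_improve = 0
    while W.shape[1] < max_nodes:
        # append `step` new hidden nodes with random parameters
        W = np.hstack([W, rng.standard_normal((d, step))])
        b = np.hstack([b, rng.standard_normal((1, step))])
        beta = elm_fit(X_tr, y_tr, W, b)
        pred = elm_predict(X_val, W, b, beta)
        rmse = np.sqrt(np.mean((pred - y_val) ** 2))
        if rmse < best_rmse:
            best_rmse, best_model = rmse, (W.copy(), b.copy(), beta)
            no_improve = 0
        else:                          # validation error no longer falls
            no_improve += 1
            if no_improve >= patience:
                break                  # stop adding hidden nodes
    return best_rmse, best_model
```

Because only the new nodes' random parameters are drawn at each step and the output weights are re-solved in closed form, each growth step stays cheap, which is what makes the validation-driven search over network size practical.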
Keywords
Flexible incremental algorithm; Stepwise constructive method; ELM (Extreme Learning Machine); Overtraining
  • Reference
1 G. B. Huang, Q. Y. Zhu, and C. K. Siew, 'Extreme learning machine: a new learning scheme of feedforward neural networks,' in Proc. Int. Joint Conf. Neural Networks (IJCNN 2004), Vol. 2, pp. 985-990, 2004
2 H. T. Huynh and Y. Won, 'Small Number of Hidden Units for ELM with Two-stage Linear Model,' IEICE Trans. on Information and Systems, Vol. E91-D, No. 4, pp. 1042-1049, 2008
3 T. Andersen and T. Martinez, 'Cross Validation and MLP Architecture Selection,' in Proc. of IEEE Int. Joint Conf. on Neural Networks IJCNN'99, CD Paper #192, 1999
4 F. Han and D. S. Huang, 'Improved Extreme Learning Machine for Function Approximation by Encoding a Priori Information,' Neurocomputing, Vol. 69, No. 16-18, pp. 2369-2373, 2006
5 G. B. Huang and L. Chen, 'Enhanced Random Search based Incremental Extreme Learning Machine,' Neurocomputing, Vol. 71, pp. 3460-3468, 2008
6 G. Castellano, A. M. Fanelli, and M. Pelillo, 'An Iterative Pruning Algorithm for Feedforward Neural Networks,' IEEE Trans. on Neural Networks, Vol. 8, No. 3, pp. 519-531, 1997
7 G. B. Huang and L. Chen, 'Convex Incremental Extreme Learning Machine,' Neurocomputing, Vol. 70, pp. 3056-3062, 2007
8 G. B. Huang, L. Chen, and C. Siew, 'Universal Approximation Using Incremental Constructive Feedforward Networks With Random Hidden Nodes,' IEEE Trans. on Neural Networks, Vol. 17, No. 4, pp. 879-892, 2006
9 N. Liang, G. Huang, P. Saratchandran, and N. Sundararajan, 'A Fast and Accurate Online Sequential Learning Algorithm for Feedforward Networks,' IEEE Trans. on Neural Networks, Vol. 17, No. 6, pp. 1411-1423, 2006
10 C. Blake and C. Merz, UCI repository of machine learning databases, (http://www.ics.uci.edu/~mlearn/MLRepository.html), Department of Information and Computer Sciences, University of California, Irvine, USA, 1998
11 J. H. Cho, D. J. Lee, and M. G. Chun, 'Parameter Optimization of Extreme Learning Machine Using Bacterial Foraging Algorithm,' Journal of the Korean Institute of Intelligent Systems, Vol. 17, No. 6, pp. 807-812, 2007