Institute of Control, Robotics and Systems (ICROS): Conference Proceedings
- 2000.10a / Pages.484-484 / 2000
Structure Minimization using Impact Factor in Neural Networks
- Seo, Kap-Ho (Dept. of Electrical Engineering and Computer Science, Korea Advanced Institute of Science and Technology) ;
- Song, Jae-Su (Dept. of Electrical Engineering and Computer Science, Korea Advanced Institute of Science and Technology) ;
- Lee, Ju-Jang (Dept. of Electrical Engineering and Computer Science, Korea Advanced Institute of Science and Technology)
- Published : 2000.10.01
Abstract
The problem of determining the proper size of a neural network is recognized to be crucial, especially for its practical implications in such important issues as learning and generalization. Unfortunately, it usually is not obvious what size is best: a network that is too small will not be able to learn the data, while one that is just big enough may learn slowly and be very sensitive to initial conditions and learning parameters. One popular technique, commonly known as pruning, consists of training a larger-than-necessary network and then removing unnecessary weights/nodes. In this paper, a new pruning method is developed, based on penalty-term methods. This method improves the network's generalization and reduces the retraining time needed after weights/nodes are pruned.
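The general penalty-term pruning pipeline the abstract refers to can be sketched as follows. This is a minimal illustration, not the paper's method: it trains a small network with an added weight-decay penalty (which drives unimportant weights toward zero) and then removes weights whose magnitude falls below a threshold. The penalty coefficient `lam` and the pruning threshold are illustrative assumptions; the paper's "impact factor" criterion is not reproduced here.

```python
import numpy as np

# Sketch of penalty-term pruning (assumed setup, not the paper's exact method):
# 1) train a larger-than-necessary network with a weight-decay penalty,
# 2) prune weights whose magnitude stayed small.

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)  # XOR toy task

W1 = rng.normal(0, 1, (2, 8)); b1 = np.zeros(8)   # oversized hidden layer
W2 = rng.normal(0, 1, (8, 1)); b2 = np.zeros(1)
lr, lam = 0.5, 1e-3  # learning rate and penalty strength (illustrative)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for _ in range(5000):
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # Backprop for MSE; the extra lam*W term is the gradient of the
    # penalty lam*||W||^2, which shrinks unneeded weights toward zero.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * (h.T @ d_out + lam * W2); b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * (X.T @ d_h + lam * W1);   b1 -= lr * d_h.sum(axis=0)

# Prune: zero out weights whose magnitude fell below a threshold.
mask1, mask2 = np.abs(W1) > 0.1, np.abs(W2) > 0.1
W1p, W2p = W1 * mask1, W2 * mask2

# Predictions of the pruned network (retraining would start from here).
pred = sigmoid(sigmoid(X @ W1p + b1) @ W2p + b2)
```

Because the penalty has already pushed superfluous weights near zero during training, the pruned network starts close to the trained one, which is why retraining after pruning is cheap in this family of methods.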
Keywords