Family of Cascade-correlation Learning Algorithm

  • Myung-Bok Choi (Dept. of Administrative Computer Science & Women's General Education, Wonju National College);
  • Sang-Un Lee (Dept. of Administrative Computer Science & Women's General Education, Wonju National College)
  • Published: 2005.02.01

Abstract

The cascade-correlation (CC) learning algorithm of Fahlman and Lebiere is one of the most influential constructive algorithms for neural networks. Cascading the hidden neurons yields a network that can represent very strong nonlinearities. Although this power is useful in principle, it can be a disadvantage when such strong nonlinearity is not required to solve the problem. Three models are presented and compared empirically. All of them are based on variants of the cascade architecture and of the output-neuron weight training of the CC algorithm. The empirical results indicate the following: (1) in pattern classification, the model that trains only the connection weights from the newly added hidden neuron to the output layer shows the best predictive ability; (2) in function approximation, the model that removes the input-output connections and uses a sigmoid-linear activation function predicts better than the CasCor algorithm.

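The abstract describes the cascade architecture and a variant that trains only the weights from a newly added hidden neuron to the output layer. The following is a minimal, illustrative sketch of that idea in Python/NumPy, not the authors' code: candidate hidden units are drawn from a random pool and selected by their correlation with the residual error (a simplification of Fahlman and Lebiere's candidate training), the output unit is linear, and the new_weights_only flag switches between retraining all output weights (CasCor-style) and fitting only the newly added connection. All function names and parameters here are illustrative assumptions.

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fit_output_weights(H, y, ridge=1e-6):
    # Ridge-regularized least squares for the output-layer weights.
    return np.linalg.solve(H.T @ H + ridge * np.eye(H.shape[1]), H.T @ y)

def add_cascade_unit(H, residual, n_candidates=20, rng=None):
    # Pick, from a random pool, the candidate unit whose output correlates
    # best (in absolute value) with the current residual error.
    rng = np.random.default_rng() if rng is None else rng
    best_v, best_score = None, -np.inf
    for _ in range(n_candidates):
        w = rng.normal(size=H.shape[1])
        v = sigmoid(H @ w)
        score = abs(np.corrcoef(v, residual)[0, 1])
        if score > best_score:
            best_v, best_score = v, score
    return best_v

def cascade_regression(X, y, n_hidden=8, new_weights_only=False, rng=None):
    # Grow a cascade network with a linear output unit.
    rng = np.random.default_rng(0) if rng is None else rng
    H = np.hstack([np.ones((X.shape[0], 1)), X])      # bias + inputs feed every unit
    w_out = fit_output_weights(H, y)
    for _ in range(n_hidden):
        residual = y - H @ w_out
        h_new = add_cascade_unit(H, residual, rng=rng)
        H = np.hstack([H, h_new[:, None]])            # cascade: new unit sees all earlier units
        if new_weights_only:
            # Variant from the abstract: freeze the existing output weights
            # and fit only the weight of the newly added hidden unit.
            w_new = fit_output_weights(h_new[:, None], residual)
            w_out = np.concatenate([w_out, w_new])
        else:
            w_out = fit_output_weights(H, y)          # retrain all output weights
    return H, w_out

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.uniform(-1.0, 1.0, size=(200, 1))
    y = np.sin(3.0 * X[:, 0]) + 0.05 * rng.normal(size=200)
    for flag in (False, True):
        H, w = cascade_regression(X, y, new_weights_only=flag, rng=np.random.default_rng(1))
        print(f"new_weights_only={flag}: training MSE = {np.mean((y - H @ w) ** 2):.4f}")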

References

  1. T-Y. Kwok and D-Y. Yeung, 'Constructive Algorithms for Structure Learning in Feedforward Neural Networks for Regression Problems,' IEEE Trans. on Neural Networks, Vol. 8, No. 3, pp. 630-645, 1997 https://doi.org/10.1109/72.572102
  2. J. Ghosh and K. Tumer, 'Structural Adaptation and Generalization in Supervised Feed-forward Networks,' Journal of Artificial Neural Networks, Vol. 1, No. 4, pp. 431-458, 1994
  3. J. Moody, 'Prediction Risk and Architecture Selection for Neural Networks,' Theory and Pattern Recognition Applications, NATO ASI Series F, pp. 147-165, Springer-Verlag, 1994
  4. S. E. Fahlman and C. Lebiere, 'The Cascade-Correlation Learning Architecture,' Advances in Neural Information Processing Systems II, pp. 525-532, 1990
  5. L. Prechelt, 'Investigation of the CasCor Family of Learning Algorithms,' Neural Networks, Vol. 10, No. 5, pp. 885-896, 1997 https://doi.org/10.1016/S0893-6080(96)00115-3
  6. M. Lehtokangas, 'Modeling with Constructive Backpropagation,' Neural Networks, Vol. 12, pp. 707-716, 1999 https://doi.org/10.1016/S0893-6080(99)00018-0
  7. C. Littmann and H. Ritter, 'Cascade Network Architectures,' Proc. Intern. Joint Conference on Neural Networks, Vol. II, pp. 398-404, 1992
  8. R. S. Crowder, 'CASCOR: Lisp and C Implementations of Cascade Correlation,' ftp://ftp.cs.cmu.edu/afs/cs.cmu.edu/project/connect/code/ported/
  9. Cascor1, http://www.cs.cmu.edu/afs/cs/project/ai-repository/ai/areas/neural/systems/cascor/c/cascor1.c
  10. T. Ash, 'Dynamic Node Creation in Backpropagation Neural Networks,' Connection Science, Vol. 1, No. 4, pp. 365-375, 1989 https://doi.org/10.1080/09540098908915647