Evolutionary Learning of Sigma-Pi Neural Trees and Its Application to Classification and Prediction


  • Published : 1996.06.01

Abstract

The necessity and usefulness of higher-order neural networks have been well known since the early days of neurocomputing. However, the explosive number of terms has hampered the design and training of such networks. In this paper we present an evolutionary learning method for efficiently constructing problem-specific higher-order neural models. The crux of the method is the neural tree representation employing both sigma and pi units, in combination with the use of an MDL-based fitness function for learning minimal models. We provide experimental results in classification and prediction problems which demonstrate the effectiveness of the method.

I. Introduction

One of the most popular neural network models used for supervised learning applications has been the multilayer feedforward network. A commonly adopted topology employs one hidden layer with full connectivity between neighboring layers. This structure has been very successful for many applications. However, it has some weaknesses. For instance, the fully connected structure is not necessarily a good topology unless the task contains a good predictor for the full input space.
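The two ingredients named in the abstract can be illustrated with a minimal sketch: a tree whose internal nodes are sigma units (weighted sums) or pi units (products), and a fitness function that adds a complexity penalty to the error, in the spirit of MDL. The node structure, the tanh squashing function, and the trade-off constant `alpha` below are illustrative assumptions, not the paper's exact formulation.

```python
import math

class Node:
    """One node of a sigma-pi neural tree (assumed recursive structure)."""

    def __init__(self, kind, weight=1.0, children=None, input_index=None):
        self.kind = kind                  # 'sigma', 'pi', or 'input'
        self.weight = weight              # weight on the edge to the parent
        self.children = children or []
        self.input_index = input_index    # used only by 'input' leaves

    def evaluate(self, x):
        if self.kind == 'input':
            return x[self.input_index]
        child_vals = [c.weight * c.evaluate(x) for c in self.children]
        if self.kind == 'sigma':          # sigma unit: weighted sum
            net = sum(child_vals)
        else:                             # pi unit: product of weighted inputs
            net = math.prod(child_vals)
        return math.tanh(net)             # squashing nonlinearity (assumed)

    def size(self):
        """Number of nodes; a simple proxy for model complexity."""
        return 1 + sum(c.size() for c in self.children)

def mdl_fitness(tree, data, alpha=0.01):
    """Mean squared error plus a size penalty; alpha is an assumed
    accuracy/parsimony trade-off constant."""
    err = sum((y - tree.evaluate(x)) ** 2 for x, y in data) / len(data)
    return err + alpha * tree.size()

# Example tree: sigma(0.5 * x0, pi(x0, x1))
tree = Node('sigma', children=[
    Node('input', weight=0.5, input_index=0),
    Node('pi', children=[
        Node('input', input_index=0),
        Node('input', input_index=1),
    ]),
])
```

An evolutionary learner would mutate and recombine such trees (changing node types, weights, and subtrees) and select by `mdl_fitness`, so that trees using fewer higher-order terms are preferred among equally accurate ones.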


