Supervised Competitive Learning Neural Network with Flexible Output Layer

  • Cho, Seong-won (School of Electronic and Electrical Engineering, Hongik University)
  • Published: 2001.12.01

Abstract

In this paper, we present a new competitive learning algorithm called Dynamic Competitive Learning (DCL). DCL is a supervised learning method that dynamically generates output neurons and automatically initializes their weight vectors from training patterns. It introduces a new parameter, LOG (Limit of Grade), which determines whether a new output neuron is created. If at least one of the LOG nearest output neurons has the same class as the current training pattern, DCL adjusts the weight vector of that neuron to learn the pattern. If all of the LOG nearest output neurons belong to classes different from that of the training pattern, a new output neuron is created and its weight vector is initialized with the training pattern. The proposed method differs significantly from previous competitive learning algorithms in that the neuron selected for learning is not limited to the winner, and output neurons are generated dynamically during learning. In addition, the algorithm has only a few parameters, which are easy to determine and apply to real-world problems. Experimental results on pattern recognition of remote sensing data and handwritten numeral data show that DCL outperforms conventional competitive learning methods.
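The training step described above can be sketched as follows. This is an illustrative reconstruction from the abstract, not the authors' implementation; the function name `dcl_step`, the learning rate `lr`, the use of Euclidean distance, and the choice to update the nearest same-class neuron are assumptions.

```python
import numpy as np

def dcl_step(weights, labels, pattern, pattern_class, log_k, lr=0.1):
    """One DCL training step (sketch based on the abstract's description).

    weights: list of weight vectors, one per output neuron
    labels:  class label assigned to each output neuron
    log_k:   the LOG (Limit of Grade) parameter
    """
    pattern = np.asarray(pattern, dtype=float)
    if not weights:
        # No output neurons yet: create the first one from the pattern.
        weights.append(pattern.copy())
        labels.append(pattern_class)
        return
    # Rank existing neurons by distance to the training pattern
    # (Euclidean distance is an assumption here).
    dists = [np.linalg.norm(pattern - w) for w in weights]
    nearest = np.argsort(dists)[:log_k]
    # Indices of LOG-nearest neurons that share the pattern's class.
    same = [i for i in nearest if labels[i] == pattern_class]
    if same:
        # A same-class neuron is among the LOG nearest: adjust its
        # weight vector toward the pattern (nearest such neuron chosen).
        i = same[0]
        weights[i] += lr * (pattern - weights[i])
    else:
        # All LOG nearest neurons have a different class: create a new
        # neuron initialized with the training pattern itself.
        weights.append(pattern.copy())
        labels.append(pattern_class)
```

Note how neuron creation replaces the fixed output layer of conventional competitive learning: the network grows only when none of the LOG nearest neurons can represent the pattern's class.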
