Multilayer Neural Network Using Delta Rule: Recognitron III

Multilayer Neural Network Computer Using the Delta Rule: Recognitron III (Korean title)

  • 김춘석 (Dept. of Electrical Engineering, Graduate School, Soongsil University) ;
  • 박충규 (Dept. of Electrical Engineering, College of Engineering, Soongsil University) ;
  • 이기한 (Dept. of Computer Engineering, Graduate School, Seoul National University) ;
  • 황희영 (Dept. of Computer Engineering, College of Engineering, Seoul National University)
  • Published : 1991.02.01

Abstract

The multilayer expansion of the single-layer NN (Neural Network) was needed to solve the linear separability problem, as shown by the classic example of the XOR function. The EBP (Error Back Propagation) learning rule is often used in multilayer neural networks, but it is not without its faults: 1) D. Rumelhart extended the Delta Rule, but there is a problem in obtaining C_a from the linear combination of the weight matrix N between the hidden layer and the output layer and H, which is itself the result of a linear combination of the input pattern and the weight matrix M between the input layer and the hidden layer. 2) Even if using the difference between C_a and D_a to adjust the weight matrix N between the hidden layer and the output layer is valid, using the same value to adjust the weight matrix M between the input layer and the hidden layer is wrong. Recognitron III is proposed to resolve these faults. According to simulation results, since Recognitron III does not train the three-layer NN as a whole, but divides it into several single-layer NNs and trains each of these with learning patterns, its learning time is 32.5 to 72.2 times faster than that of the EBP NN. The number of patterns learnable by an EBP NN with n input and output cells and n+1 hidden cells is 2^n, but only n in a Recognitron III of the same size [5]. In the case of pattern generalization, however, the EBP NN falls short of Recognitron III.
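As a concrete illustration (not taken from the paper), the following Python/NumPy sketch trains one single-layer network with the delta rule; the sigmoid activation, the function names, and the hyperparameters `lr` and `epochs` are illustrative assumptions rather than the paper's actual Recognitron III decomposition. A single layer of this kind is the unit that Recognitron III trains separately, and the accompanying OR example (versus XOR) shows the linear-separability limit the abstract mentions.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_delta_rule(X, D, lr=0.5, epochs=1000):
    """Train one weight matrix W so that sigmoid(X @ W) approximates D.

    This is a generic delta-rule sketch, not the paper's algorithm.
    """
    rng = np.random.default_rng(0)
    W = rng.normal(scale=0.1, size=(X.shape[1], D.shape[1]))
    for _ in range(epochs):
        C = sigmoid(X @ W)      # actual output (the abstract's C_a)
        error = D - C           # difference D_a - C_a
        # Delta rule: adjust W in proportion to the error, the
        # activation derivative, and the input pattern.
        W += lr * X.T @ (error * C * (1.0 - C))
    return W

# A 2-input, 1-output layer (third column is a bias input) can learn OR,
# but no single layer can learn XOR -- the linear-separability problem
# that motivates multilayer networks.
X = np.array([[0, 0, 1], [0, 1, 1], [1, 0, 1], [1, 1, 1]], dtype=float)
D_or = np.array([[0], [1], [1], [1]], dtype=float)
W = train_delta_rule(X, D_or)
print(np.round(sigmoid(X @ W)))  # approximately [[0], [1], [1], [1]]
```

Training each layer in isolation this way, rather than backpropagating one output error through all layers, is what the abstract credits for Recognitron III's shorter learning time.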

Keywords