• Title/Summary/Keyword: NETLA


NETLA Based Optimal Synthesis Method of Binary Neural Network for Pattern Recognition

  • Lee, Joon-Tark
    • Journal of the Korean Institute of Intelligent Systems / v.14 no.2 / pp.216-221 / 2004
  • This paper describes an optimal synthesis method of binary neural network for pattern recognition. Our objective is to minimize the number of connections and the number of neurons in the hidden layer by using a Newly Expanded and Truncated Learning Algorithm (NETLA) for multilayered neural networks. The synthesis method in NETLA uses the Expanded Sum of Product (ESP) of boolean expressions and is based on the multilayer perceptron. It has the ability to optimize a given binary neural network in the binary space without the iterative learning required by the conventional Error Back Propagation (EBP) algorithm. Furthermore, NETLA can reduce the number of required neurons in the hidden layer and the number of connections. Therefore, this learning algorithm can speed up training for pattern recognition problems. The superiority of NETLA over other learning algorithms is demonstrated by a practical application to the approximation problem of a circular region.
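The ESP idea in the abstract can be illustrated with a minimal sketch (my own illustrative code, not from the paper; the names `threshold_neuron` and `esp_network` are hypothetical): each product term of a boolean expression maps to one hidden threshold neuron acting as an AND, and the output neuron ORs the hidden activations.

```python
def threshold_neuron(inputs, weights, threshold):
    """Binary threshold unit: fires 1 iff the weighted sum >= threshold."""
    return 1 if sum(w * x for w, x in zip(weights, inputs)) >= threshold else 0

def esp_network(x, product_terms):
    """Evaluate a sum-of-products boolean function as a two-layer BNN.

    Each product term is a dict {input index: required value}; it maps
    to one hidden AND neuron (weight +1 for a required 1, weight -1 for
    a required 0, threshold = number of required 1s). The output neuron
    is an OR: any hidden neuron firing suffices.
    """
    hidden = []
    for term in product_terms:
        weights = [0] * len(x)
        threshold = 0
        for i, v in term.items():
            if v == 1:
                weights[i] = 1
                threshold += 1
            else:
                weights[i] = -1
        hidden.append(threshold_neuron(x, weights, threshold))
    return threshold_neuron(hidden, [1] * len(hidden), 1)

# XOR written as an ESP: x0'x1 + x0x1'
xor_terms = [{0: 0, 1: 1}, {0: 1, 1: 0}]
for x in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x, esp_network(list(x), xor_terms))
```

This shows why a minimal ESP directly bounds the hidden-layer size: one hidden neuron per product term, so minimizing the boolean expression minimizes the network.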

Optimal Synthesis of Binary Neural Network using NETLA (NETLA를 이용한 이진 신경회로망의 최적합성)

  • 정종원;성상규;지석준;최우진;이준탁
    • Proceedings of the Korean Society of Marine Engineers Conference / 2002.05a / pp.273-277 / 2002
  • This paper describes an optimal synthesis method of binary neural network (BNN) for an approximation problem of a circular region and a synthetic image having four classes, using a newly proposed learning algorithm. Our objective is to minimize the number of connections and neurons in the hidden layer by using a Newly Expanded and Truncated Learning Algorithm (NETLA) based on the multilayer BNN. The synthesis method in the NETLA is based on the extension principle of the Expanded and Truncated Learning (ETL) algorithm using the multilayer perceptron, and on the Expanded Sum of Product (ESP), one of the boolean expression techniques. The number of required neurons in the hidden layer can be reduced, and learning for pattern recognition can be accelerated. The superiority of this NETLA over other algorithms was proved by simulation.


Optimal Synthesis Method for Binary Neural Network using NETLA (NETLA를 이용한 이진 신경회로망의 최적 합성방법)

  • Sung, Sang-Kyu;Kim, Tae-Woo;Park, Doo-Hwan;Jo, Hyun-Woo;Ha, Hong-Gon;Lee, Joon-Tark
    • Proceedings of the KIEE Conference / 2001.07d / pp.2726-2728 / 2001
  • This paper describes an optimal synthesis method of binary neural network (BNN) for an approximation problem of a circular region using a newly proposed learning algorithm [7]. Our objective is to minimize the number of connections and neurons in the hidden layer by using a Newly Expanded and Truncated Learning Algorithm (NETLA) for the multilayer BNN. The synthesis method in the NETLA is based on the extension principle of Expanded and Truncated Learning (ETL) and on the Expanded Sum of Product (ESP), one of the boolean expression techniques. It has the ability to optimize a given BNN in the binary space without the iterative training required by the conventional Error Back Propagation (EBP) algorithm [6]. If only the true and false patterns are given, the connection weights and threshold values can be immediately determined by the optimal synthesis method of the NETLA without any tedious learning. Furthermore, the number of required neurons in the hidden layer can be reduced and fast learning of the BNN can be realized. The superiority of this NETLA over other algorithms was proved by the approximation problem of one circular region.
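The claim that weights and thresholds can be determined immediately from the true patterns (rather than learned iteratively) can be sketched with an ETL-style geometric construction; this is my own illustration of the idea, not the paper's exact NETLA procedure, and `neuron_from_core` is a hypothetical name. For a binary core pattern, picking weight +1 where the core bit is 1 and -1 where it is 0 gives w·x = (#ones in core) - (Hamming distance to core), so a threshold fixes the covered Hamming ball in closed form.

```python
def hamming(a, b):
    """Hamming distance between two equal-length binary patterns."""
    return sum(x != y for x, y in zip(a, b))

def neuron_from_core(core, radius):
    """Fix weights/threshold directly from a core true pattern:
    the unit fires exactly for patterns within Hamming distance
    `radius` of `core`, since w.x = sum(core) - distance."""
    weights = [1 if c == 1 else -1 for c in core]
    threshold = sum(core) - radius
    return weights, threshold

def fires(x, weights, threshold):
    """Binary threshold unit."""
    return 1 if sum(w * xi for w, xi in zip(weights, x)) >= threshold else 0

core = [1, 0, 1, 1]
w, t = neuron_from_core(core, 1)   # cover the core and its 1-bit neighbours
for x in ([1, 0, 1, 1], [1, 1, 1, 1], [0, 0, 1, 1], [0, 1, 1, 1]):
    print(x, hamming(x, core), fires(x, w, t))
```

No gradient steps are involved: the weights follow from the pattern geometry alone, which is the sense in which such synthesis avoids "tedious learning."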


Binary Neural Network in Binary Space using NETLA (NETLA를 이용한 이진 공간내의 패턴분류)

  • Sung, Sang-Kyu;Park, Doo-Hwan;Jeong, Jong-Won;Lee, Joo-Tark
    • Proceedings of the KIEE Conference / 2001.11c / pp.431-434 / 2001
  • When the single-layer perceptron was first developed, it attracted researchers' attention because it could learn to recognize simple patterns. Since a single-layer perceptron can express binary logic with a single unit merely by changing its weights, it has been used in image processing, pattern recognition, and scene recognition. Recently, the Back-Propagation learning algorithm has been applied to mapping problems in binary space. However, back-propagation suffers from long training times and inefficient performance in continuous space, and it generally requires many iterations even for a simple mapping in binary space. In back-propagation, because the number of hidden-layer neurons needed to solve a given problem is not known a priori, it is chosen depending on the numbers of neurons in the input and output layers. Therefore, one of the most important problems in applying three-layer neural networks is determining the number of neurons required in the hidden layer; with no adequate method for network synthesis and weight determination, their practical application domain has been limited. This paper presents a new learning method for pattern classification: a geometric learning algorithm called NETLA (Newly Expanded and Truncated Learning Algorithm), which automatically determines the number of hidden-layer neurons in a binary neural network based on a geometric analysis of the training inputs. The superiority of the proposed algorithm is demonstrated through simulation.
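The abstract's key point, that the hidden-layer size falls out of the geometry of the training patterns instead of being guessed in advance, can be sketched with a greedy covering loop. This is my own illustrative sketch of the geometric idea (not the paper's exact algorithm); `greedy_hidden_layer` is a hypothetical name, and each hidden neuron is a Hamming ball grown around an uncovered true pattern until it would swallow a false pattern.

```python
def hamming(a, b):
    """Hamming distance between two equal-length binary patterns."""
    return sum(x != y for x, y in zip(a, b))

def greedy_hidden_layer(true_patterns, false_patterns):
    """Cover the true patterns with Hamming balls, one hidden threshold
    neuron per ball. The hidden-layer size is whatever the covering
    needs -- it is not fixed before training."""
    neurons = []                       # (weights, threshold) pairs
    uncovered = list(true_patterns)
    while uncovered:
        core = uncovered[0]
        # Largest radius that still excludes every false pattern.
        radius = min(hamming(core, f) for f in false_patterns) - 1
        weights = [1 if c else -1 for c in core]
        threshold = sum(core) - radius  # fires iff distance <= radius
        neurons.append((weights, threshold))
        uncovered = [p for p in uncovered
                     if sum(w * x for w, x in zip(weights, p)) < threshold]
    return neurons

# 2-input XOR: true patterns {01, 10}, false patterns {00, 11}
trues = [(0, 1), (1, 0)]
falses = [(0, 0), (1, 1)]
net = greedy_hidden_layer(trues, falses)
print(len(net))   # 2 hidden neurons needed for XOR
```

A subsequent OR output neuron over these hidden units then classifies every true pattern as 1 and every false pattern as 0, so the whole network is synthesized without back-propagation.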
