• Title/Summary/Keyword: Delta Learning Method

Auto-Tuning Method of Learning Rate for Performance Improvement of Backpropagation Algorithm (역전파 알고리즘의 성능개선을 위한 학습율 자동 조정 방식)

  • Kim, Joo-Woong;Jung, Kyung-Kwon;Eom, Ki-Hwan
    • Journal of the Institute of Electronics Engineers of Korea CI / v.39 no.4 / pp.19-27 / 2002
  • We propose an auto-tuning method for the learning rate to improve the performance of the backpropagation algorithm. The proposed method uses a fuzzy logic system to tune the learning rate automatically: instead of choosing a fixed learning rate, the fuzzy logic system adjusts it dynamically. The inputs of the fuzzy logic system are $\Delta$ and $\bar{\Delta}$, and its output is the learning rate. To verify the effectiveness of the proposed method, we performed simulations on the N-parity problem, function approximation, and Arabic numeral classification. The results show that the proposed method considerably improves performance compared with plain backpropagation, backpropagation with momentum, and Jacobs' delta-bar-delta method (a sketch of the delta-bar-delta update appears below).
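Jacobs' delta-bar-delta rule is the baseline that this entry and several of those below compare against, and it is also where the inputs $\Delta$ (current gradient) and $\bar{\Delta}$ (its running average) come from. A minimal sketch of the rule follows; the variable names and the values of kappa, phi, and theta are illustrative defaults, not taken from the papers.

```python
import numpy as np

def delta_bar_delta_step(w, grad, lr, delta_bar,
                         kappa=0.01, phi=0.1, theta=0.7):
    """One delta-bar-delta update for a weight vector `w`.

    Each weight keeps its own learning rate `lr[i]`.  If the current
    gradient `grad[i]` agrees in sign with the smoothed gradient
    `delta_bar[i]`, the rate grows additively by `kappa`; if the signs
    disagree (oscillation), the rate shrinks multiplicatively by `phi`.
    """
    agree = grad * delta_bar > 0
    disagree = grad * delta_bar < 0
    lr = lr + kappa * agree                          # additive increase
    lr = lr * np.where(disagree, 1.0 - phi, 1.0)     # multiplicative decrease
    w = w - lr * grad                                # step with per-weight rates
    delta_bar = (1.0 - theta) * grad + theta * delta_bar  # smoothed gradient
    return w, lr, delta_bar
```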

Performance Improvement of Backpropagation Algorithm by Automatic Tuning of Learning Rate using Fuzzy Logic System

  • Jung, Kyung-Kwon;Lim, Joong-Kyu;Chung, Sung-Boo;Eom, Ki-Hwan
    • Journal of information and communication convergence engineering / v.1 no.3 / pp.157-162 / 2003
  • We propose a learning method for improving the performance of the backpropagation algorithm. The proposed method uses a fuzzy logic system to tune the learning rate of each weight automatically: instead of choosing a fixed learning rate, the fuzzy logic system adjusts the learning rate dynamically. The inputs of the fuzzy logic system are delta and delta-bar, and its output is the learning rate. To verify the effectiveness of the proposed method, we performed simulations on the XOR problem, character classification, and function approximation. The results show that the proposed method considerably improves performance compared with general backpropagation, backpropagation with momentum, and Jacobs' delta-bar-delta algorithm (an illustrative fuzzy controller of this kind is sketched below).
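The abstracts do not give the membership functions or rule base of their fuzzy logic systems, so the sketch below is only a guess at the general idea: a zero-order Sugeno-style controller that maps delta and delta-bar to a per-weight learning rate. All scaling constants, rule shapes, and the lr_min/lr_max bounds are assumptions.

```python
import numpy as np

def fuzzy_learning_rate(delta, delta_bar, lr_min=0.01, lr_max=0.9):
    """Illustrative fuzzy mapping from (delta, delta_bar) to a learning rate.

    Rough rule base (assumed, not from the papers):
      - gradients agree and are small  -> large rate (flat region, speed up)
      - gradients agree and are large  -> medium rate
      - gradients disagree             -> small rate (oscillation, slow down)
    """
    agreement = np.tanh(delta * delta_bar * 100.0)   # >0 when signs agree
    magnitude = np.tanh(abs(delta) * 10.0)           # 0..1, gradient size

    mu_small_rate = np.clip(-agreement, 0.0, 1.0)                     # disagree
    mu_medium_rate = np.clip(agreement, 0.0, 1.0) * magnitude         # agree, steep
    mu_large_rate = np.clip(agreement, 0.0, 1.0) * (1.0 - magnitude)  # agree, flat

    # Crisp consequents and weighted-average defuzzification.
    rates = np.array([lr_min, 0.5 * (lr_min + lr_max), lr_max])
    weights = np.array([mu_small_rate, mu_medium_rate, mu_large_rate])
    if weights.sum() == 0.0:
        return lr_min
    return float(weights @ rates / weights.sum())
```

A rate computed this way would simply replace the fixed rate in the usual per-weight update `w -= lr * grad`.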

Learning Performance Improvement of Fuzzy RBF Network (퍼지 RBF 네트워크의 학습 성능 개선)

  • Kim Kwang-Baek
    • Journal of Korea Multimedia Society / v.9 no.3 / pp.369-376 / 2006
  • In this paper, we propose an improved fuzzy RBF network that dynamically adjusts the learning rate by applying the Delta-bar-Delta algorithm, in order to improve the learning performance of fuzzy RBF networks. The proposed learning algorithm combines the fuzzy C-means algorithm with the generalized delta learning method and improves learning performance by dynamically adjusting the learning rate. This is achieved by self-generating middle-layer nodes and by applying the Delta-bar-Delta algorithm to the generalized delta learning used for the middle and output layers. To evaluate the learning performance of the proposed RBF network, we used 40 identifiers extracted from a container image as the training data. Our experimental results show that the proposed method requires less training time and converges better than the conventional ART2-based RBF network and the fuzzy RBF network (the standard fuzzy C-means updates are sketched below).
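For context, a minimal sketch of the standard fuzzy C-means updates (membership matrix and cluster centers) follows. It is the textbook algorithm, not the paper's exact combination with the generalized delta rule, and the fuzzifier m = 2 is just the common default.

```python
import numpy as np

def fcm_memberships(X, centers, m=2.0, eps=1e-9):
    """Standard fuzzy C-means membership update.

    X: (n_samples, n_features); centers: (n_clusters, n_features).
    Returns u with u[j, i] = membership of sample i in cluster j.
    """
    # Distances from every center to every sample, shape (c, n).
    d = np.linalg.norm(X[None, :, :] - centers[:, None, :], axis=2) + eps
    power = 2.0 / (m - 1.0)
    # u[j, i] = 1 / sum_k (d[j, i] / d[k, i]) ** power
    ratio = (d[:, None, :] / d[None, :, :]) ** power
    return 1.0 / ratio.sum(axis=1)

def fcm_centers(X, u, m=2.0):
    """Standard fuzzy C-means center update from the membership matrix."""
    um = u ** m                                        # (c, n)
    return (um @ X) / um.sum(axis=1, keepdims=True)    # (c, n_features)
```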

Optimal Heating Load Identification using a DRNN (DRNN을 이용한 최적 난방부하 식별)

  • Chung, Kee-Chull;Yang, Hai-Won
    • The Transactions of the Korean Institute of Electrical Engineers A / v.48 no.10 / pp.1231-1238 / 1999
  • This paper presents an approach to optimal heating load identification using diagonal recurrent neural networks (DRNN). The DRNN captures the dynamic nature of a system, and since it is not fully connected, training is much faster than for a fully connected recurrent neural network. The DRNN architecture is a modified fully connected recurrent neural network with one hidden layer, whose hidden layer consists of self-recurrent neurons, each feeding its output back only into itself. In this study, dynamic backpropagation (DBP) with the delta-bar-delta learning method is used to train the optimal heating load identifier. The delta-bar-delta learning method is an empirical method that adapts the learning rate gradually during training in order to improve accuracy in a short time. Simulation results based on experimental data show that the proposed model is superior to the other methods in most cases, in terms of both learning speed and identification accuracy (the diagonal recurrence is sketched below).
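The defining feature of the DRNN described above is that each hidden neuron's only recurrent connection is to itself. A minimal forward-pass sketch under that assumption follows; the array shapes, the tanh nonlinearity, and the single linear output are illustrative choices, not details from the paper.

```python
import numpy as np

def drnn_forward(x_seq, W_in, w_rec, w_out, b_h=None):
    """Forward pass of a diagonal recurrent neural network (DRNN).

    Each hidden neuron feeds its output back only to itself, so the
    recurrent weights form a vector `w_rec` (the diagonal) rather than a
    full matrix as in a fully connected recurrent network.
    x_seq: (T, n_in); W_in: (n_hidden, n_in); w_rec, w_out: (n_hidden,).
    """
    n_hidden = W_in.shape[0]
    s = np.zeros(n_hidden)                 # self-recurrent hidden state
    if b_h is None:
        b_h = np.zeros(n_hidden)
    outputs = []
    for x in x_seq:
        s = np.tanh(W_in @ x + w_rec * s + b_h)   # diagonal recurrence only
        outputs.append(w_out @ s)                 # linear output neuron
    return np.array(outputs)
```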

Enhanced RBF Network by Using Auto-Tuning Method of Learning Rate, Momentum and ART2

  • Kim, Kwang-baek;Moon, Jung-wook
    • Proceedings of the KAIS Fall Conference / 2003.11a / pp.84-87 / 2003
  • This paper proposes an enhanced RBF network that dynamically adjusts the learning rate and momentum using a fuzzy system, in order to adapt the connection weights between the middle layer and the output layer of the RBF network effectively. ART2 is applied as the learning structure between the input layer and the middle layer, and the proposed auto-tuning method of adjusting the learning rate is used to adapt the connection weights between the middle layer and the output layer. The improvement of the proposed method in learning speed and convergence is verified by comparing it with the conventional delta-bar-delta algorithm and with the ART2-based RBF network.

Enhanced Backpropagation Algorithm by Auto-Tuning Method of Learning Rate using Fuzzy Control System (퍼지 제어 시스템을 이용한 학습률 자동 조정 방법에 의한 개선된 역전파 알고리즘)

  • 김광백;박충식
    • Journal of the Korea Institute of Information and Communication Engineering / v.8 no.2 / pp.464-470 / 2004
  • We propose an enhanced backpropagation algorithm with auto-tuning of the learning rate by a fuzzy control system, for performance improvement of the backpropagation algorithm. We propose two techniques that address the local-minima and long-training-time problems. First, if the absolute difference between the target and the actual output value is less than or equal to $\varepsilon$, the output is counted as correct; if it is greater than $\varepsilon$, it is counted as incorrect. Second, instead of choosing a fixed learning rate, the fuzzy control system dynamically adjusts the learning rate: its inputs are the numbers of correct and incorrect outputs, and its output is the learning rate. To evaluate the performance of the proposed method, we applied it to the XOR problem and numeral pattern classification. The experimental results show that the proposed method improves performance compared with conventional backpropagation, backpropagation with momentum, and Jacobs' delta-bar-delta method (an illustrative mapping from correctness counts to a learning rate is sketched below).
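The abstract names the controller's inputs (counts of correct and incorrect outputs under the $\varepsilon$ tolerance) and its output (the learning rate), but not its rule base. The sketch below therefore only illustrates the idea with assumed triangular memberships over the accuracy; lr_min, lr_max, and the membership shapes are not from the paper.

```python
def lr_from_accuracy(n_correct, n_incorrect, lr_min=0.05, lr_max=0.9):
    """Illustrative fuzzy-style mapping from correctness counts to a learning rate.

    Mostly incorrect outputs -> large rate, mostly correct -> small rate,
    with a medium rate in between; defuzzified as a weighted average.
    """
    total = n_correct + n_incorrect
    if total == 0:
        return lr_max
    acc = n_correct / total
    # Three overlapping triangular memberships over the accuracy axis.
    mu_low_acc = max(0.0, 1.0 - 2.0 * acc)        # mostly incorrect
    mu_high_acc = max(0.0, 2.0 * acc - 1.0)       # mostly correct
    mu_mid_acc = 1.0 - mu_low_acc - mu_high_acc   # in between
    # Crisp consequents and weighted-average defuzzification.
    rates = (lr_max, 0.5 * (lr_min + lr_max), lr_min)
    mus = (mu_low_acc, mu_mid_acc, mu_high_acc)
    return sum(m * r for m, r in zip(mus, rates)) / sum(mus)
```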

A Study on Enhanced Self-Generation Supervised Learning Algorithm for Image Recognition (영상 인식을 위한 개선된 자가 생성 지도 학습 알고리듬에 관한 연구)

  • Kim, Tae-Kyung;Kim, Kwang-Baek;Paik, Joon-Ki
    • The Journal of Korean Institute of Communications and Information Sciences / v.30 no.2C / pp.31-40 / 2005
  • We propose an enhanced self-generation supervised learning algorithm that combines an ART algorithm with the delta-bar-delta method. From the input layer to the hidden layer, ART-1 and ART-2 are used to generate nodes. A winner-take-all method is adopted for connection weight adaptation, so that only the weights storing the pattern matching the input are updated. We test the method on the recognition of student identification cards, certificates of residence, and container identifiers, tasks that require hidden-layer nodes in a neural network. The simulation results show that the proposed self-generation supervised learning algorithm reduces the possibility of local minima, alleviates network paralysis, and improves learning speed compared with conventional neural networks (a winner-take-all update is sketched below).
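Winner-take-all adaptation is the mechanism the abstract spells out, so here is a minimal competitive-learning sketch of it: only the node whose stored pattern is closest to the input is moved. The distance measure, the learning rate, and the update form are illustrative assumptions, not the paper's exact rule.

```python
import numpy as np

def winner_take_all_update(x, W, lr=0.1):
    """Winner-take-all weight adaptation (illustrative).

    W: (n_nodes, n_features) stored patterns; x: (n_features,) input.
    Only the winning node's weights move toward x, so the other stored
    patterns are left undisturbed.
    """
    winner = int(np.argmin(np.linalg.norm(W - x, axis=1)))
    W[winner] += lr * (x - W[winner])     # move only the winner toward x
    return winner, W
```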

Variation of activation functions for accelerating the learning speed of the multilayer neural network (다층 구조 신경회로망의 학습 속도 향상을 위한 활성화 함수의 변화)

  • Lee, Byung-Do;Lee, Min-Ho
    • Journal of Sensor Science and Technology / v.8 no.1 / pp.45-52 / 1999
  • In this paper, an enhanced learning method is proposed for improving the learning speed of the error backpropagation algorithm. To cope with the premature saturation phenomenon at the initial learning stage, a variation scheme for the activation functions using higher-order functions is introduced, which requires little additional computation. It naturally raises the effective learning rate of the interconnection weights when the derivative of the sigmoid function abnormally decreases to a small value during a learning epoch. We also suggest a hybrid learning method that incorporates the proposed scheme into the momentum training algorithm. Computer simulation results show that the proposed learning algorithm outperforms conventional methods such as the momentum and delta-bar-delta algorithms (the saturation problem it targets is sketched below).
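The premature saturation problem mentioned above comes from the sigmoid derivative y(1 - y) collapsing toward zero when a unit saturates, which freezes the backpropagated error. The sketch below demonstrates the effect and one generic remedy (a small constant added to the derivative); it does not reproduce the paper's higher-order activation scheme, and the offset value is arbitrary.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def backprop_delta(target, y, derivative_offset=0.0):
    """Output-layer error term delta = (t - y) * f'(net) for a sigmoid unit.

    With a plain sigmoid, f'(net) = y * (1 - y) collapses toward zero when
    the unit saturates (y near 0 or 1): premature saturation.
    `derivative_offset` illustrates one generic fix, not the paper's scheme.
    """
    return (target - y) * (y * (1.0 - y) + derivative_offset)

# A saturated unit: y is near 1, so the plain delta is almost zero even
# though the target is 0 and the weights badly need correcting.
y = sigmoid(8.0)
print(backprop_delta(0.0, y))        # ~ -3e-4, learning stalls
print(backprop_delta(0.0, y, 0.1))   # ~ -0.1, the weights keep moving
```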

Container Image Recognition using Fuzzy-based Noise Removal Method and ART2-based Self-Organizing Supervised Learning Algorithm (퍼지 기반 잡음 제거 방법과 ART2 기반 자가 생성 지도 학습 알고리즘을 이용한 컨테이너 인식 시스템)

  • Kim, Kwang-Baek;Heo, Gyeong-Yong;Woo, Young-Woon
    • Journal of the Korea Institute of Information and Communication Engineering / v.11 no.7 / pp.1380-1386 / 2007
  • This paper proposes an automatic recognition system for shipping container identifiers using a fuzzy-based noise removal method and an ART2-based self-organizing supervised learning algorithm. Generally, the characters of a shipping container identifier are either black or white. Considering this feature, all areas of a container image other than black or white areas are regarded as noise, and identifier areas are discriminated from noise areas using a fuzzy-based noise detection method. Identifier areas are extracted by applying, in turn, Sobel-mask edge detection and vertical and horizontal block extraction to the noise-removed image. The extracted areas are binarized using an iterative binarization algorithm, and individual identifiers are extracted using an 8-directional contour tracking method. For identifier recognition, this paper proposes an ART2-based self-organizing supervised learning algorithm that improves learning performance by applying generalized delta learning and the Delta-bar-Delta algorithm. Experiments using real images of shipping containers show that the proposed identifier extraction method and the ART2-based self-organizing supervised learning algorithm improve on previously proposed methods (a rough sketch of the extraction pipeline is given below).
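To make the extraction pipeline concrete, here is a rough OpenCV sketch of its shape: edge emphasis, binarization, and contour-based character candidate extraction. Otsu thresholding and cv2.findContours stand in for the paper's iterative binarization and 8-directional contour tracking, and the size thresholds are made-up values, so this illustrates the pipeline's structure rather than the paper's exact algorithms.

```python
import cv2
import numpy as np

def extract_character_boxes(image_path):
    """Rough stand-in for the identifier-extraction pipeline described above."""
    gray = cv2.cvtColor(cv2.imread(image_path), cv2.COLOR_BGR2GRAY)

    # Edge emphasis with Sobel masks (horizontal + vertical gradients).
    gx = cv2.Sobel(gray, cv2.CV_64F, 1, 0, ksize=3)
    gy = cv2.Sobel(gray, cv2.CV_64F, 0, 1, ksize=3)
    edges = cv2.convertScaleAbs(np.hypot(gx, gy))

    # Binarization (Otsu here, in place of the paper's iterative method).
    _, binary = cv2.threshold(edges, 0, 255,
                              cv2.THRESH_BINARY + cv2.THRESH_OTSU)

    # Connected character candidates via contour finding
    # (in place of the paper's 8-directional contour tracking).
    contours = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                cv2.CHAIN_APPROX_SIMPLE)[-2]
    boxes = [cv2.boundingRect(c) for c in contours]
    # Keep plausibly character-sized regions (illustrative thresholds).
    return [b for b in boxes if b[2] > 5 and b[3] > 10]
```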

A Study on Auto-Tuning Method of Learning Rate by Using Fuzzy Logic System (퍼지 논리 시스템을 이용한 학습률 자동 조정 방법에 관한 연구)

  • 주영호;김태영;김광백
    • Proceedings of the Korea Intelligent Information System Society Conference / 2003.05a / pp.484-489 / 2003
  • In this paper, we propose an auto-tuning method for the learning rate using a fuzzy logic system to improve the performance of the backpropagation algorithm. The proposed method classifies an output as correct if the absolute difference between the target value and the output value is less than or equal to $\varepsilon$, and as incorrect if it is greater. The total number of correct outputs is fed into the fuzzy logic system to dynamically adjust the learning rate and momentum. Experiments on the XOR problem and a numeral pattern problem confirm that the proposed method improves performance over the conventional backpropagation algorithm, the momentum method, and Jacobs' delta-bar-delta method.
