• Title/Summary/Keyword: Constrained back propagation algorithm


The Constrained Least Mean Square Error Method (제한 최소 자승오차법)

  • 나희승;박영진
    • Journal of KSNVE / v.4 no.1 / pp.59-69 / 1994
  • A new LMS algorithm, termed 'constrained LMS', is proposed for problems with a constrained structure. The conventional LMS algorithm cannot be used because it destroys the constrained structure of the weights or parameters. The proposed method uses error back-propagation, which is popular in training neural networks, for error minimization. Illustrative examples are shown to demonstrate the applicability of the proposed algorithm.
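
The abstract does not spell out the update rule, but the core idea (keep the weights inside a known structure and obtain the gradient by the chain rule, as in error back-propagation) can be illustrated with a minimal sketch. The linear parameterization w = C a, the symmetric-FIR example, and all names below are illustrative assumptions, not the paper's actual formulation.

```python
import numpy as np

def constrained_lms(x, d, C, mu=0.01):
    """LMS adaptation of w = C @ a, so the assumed structural constraint
    (w restricted to the column space of C) holds at every step."""
    n_taps, n_free = C.shape
    a = np.zeros(n_free)                      # free parameters behind the constraint
    y = np.zeros(len(x))
    for k in range(n_taps - 1, len(x)):
        u = x[k - n_taps + 1:k + 1][::-1]     # most recent n_taps input samples
        w = C @ a                             # constrained filter weights
        y[k] = w @ u
        e = d[k] - y[k]
        # chain rule ("error back-propagation" through the constraint):
        # dJ/da = (dJ/dw) C with J = 0.5*e**2, so only the free parameters move
        a += mu * e * (C.T @ u)
    return C @ a, y

# Example: identify a symmetric (linear-phase) 4-tap system, w = [a0, a1, a1, a0]
C = np.array([[1., 0.],
              [0., 1.],
              [0., 1.],
              [1., 0.]])
rng = np.random.default_rng(0)
x = rng.standard_normal(2000)
d = np.convolve(x, [0.5, 1.0, 1.0, 0.5])[:len(x)]   # symmetric unknown system
w_hat, _ = constrained_lms(x, d, C, mu=0.02)        # w_hat is symmetric at every step
```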


A study on the active noise control using generalized CLMS (일반화된 제한 최소자승법을 이용한 능동 소음제어에 관한 연구)

  • 나희승;박영진
    • Institute of Control, Robotics and Systems Conference Proceedings (제어로봇시스템학회 학술대회논문집) / 1993.10a / pp.52-57 / 1993
  • The conventional active noise control algorithm for duct systems is developed without considering the constrained structure of the problem; it therefore destroys the constrained structure of the weights or parameters. A new LMS algorithm that preserves the constraints is proposed for systems with a known constrained structure. It is based on error back-propagation. A stability analysis and a simulation example are also included.
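
As a hedged sketch only: a filtered-x LMS loop for a duct-like setup in which the controller weights are forced to stay in an assumed subspace w = C a, with the gradient taken with respect to the free parameters via the chain rule. The secondary-path handling and the constraint matrix C are illustrative assumptions rather than the paper's generalized CLMS.

```python
import numpy as np

def constrained_fxlms(x, d, s, C, mu=1e-3):
    """Filtered-x LMS for a duct-style ANC loop in which the controller
    weights are constrained to w = C @ a (an assumed known structure)."""
    n_taps, n_free = C.shape
    a = np.zeros(n_free)
    y_buf = np.zeros(len(s))                  # recent anti-noise samples seen by the secondary path
    xf = np.convolve(x, s)[:len(x)]           # reference filtered by the secondary-path model
    e_hist = np.zeros(len(x))
    for k in range(n_taps - 1, len(x)):
        u = x[k - n_taps + 1:k + 1][::-1]
        y = (C @ a) @ u                       # anti-noise driving the secondary loudspeaker
        y_buf = np.roll(y_buf, 1)
        y_buf[0] = y
        e = d[k] + s @ y_buf                  # residual at the error microphone
        uf = xf[k - n_taps + 1:k + 1][::-1]
        a -= mu * e * (C.T @ uf)              # chain rule keeps w = C @ a in the allowed subspace
        e_hist[k] = e
    return C @ a, e_hist
```

Because the update acts only on the free parameters a, the controller never leaves the constrained subspace, which is the property the abstract highlights; the paper's stability analysis is not reproduced here.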


Long-term Prediction of Speech Signal Using a Neural Network (신경 회로망을 이용한 음성 신호의 장구간 예측)

  • 이기승
    • The Journal of the Acoustical Society of Korea / v.21 no.6 / pp.522-530 / 2002
  • This paper introduces a neural network (NN)-based nonlinear predictor for the LP (linear prediction) residual. To evaluate the effectiveness of the NN-based nonlinear predictor for the LP residual, we first compared the average prediction gain of the linear long-term predictor with that of the NN-based nonlinear long-term predictor. Then, the effects of quantization noise on the nonlinear prediction residual were investigated for the NN-based nonlinear predictor. The new NN predictor takes into consideration not only the prediction error but also quantization effects. To increase robustness against quantization noise in the nonlinear prediction residual, a constrained back-propagation learning algorithm that satisfies a Kuhn-Tucker inequality condition is proposed. Experimental results indicate that the prediction gain of the proposed NN predictor was not seriously decreased even when the constrained optimization algorithm was employed.
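
The exact Kuhn-Tucker condition is not given in the abstract, so the following is only a schematic sketch: a one-hidden-layer predictor for the LP residual trained by back-propagation, with a projected-gradient step enforcing an assumed inequality constraint (a Frobenius-norm bound on the hidden weights) whose fixed points satisfy the corresponding Kuhn-Tucker conditions. The pitch-lag window, network size, and constraint are all illustrative choices.

```python
import numpy as np

def train_constrained_predictor(r, lag, order=3, hidden=8, w_max=3.0,
                                lr=0.01, epochs=20, seed=0):
    """NN long-term predictor for an LP residual r (illustrative setup).

    Predicts r[k] from `order` samples around the pitch lag (lag >> order assumed).
    Back-propagation plus a projection that enforces ||W1||_F <= w_max,
    a stand-in for the paper's Kuhn-Tucker inequality condition.
    """
    rng = np.random.default_rng(seed)
    W1 = rng.standard_normal((hidden, order)) * 0.1
    b1 = np.zeros(hidden)
    W2 = rng.standard_normal(hidden) * 0.1
    b2 = 0.0
    for _ in range(epochs):
        for k in range(lag + 1, len(r)):
            i = k - lag - 1
            x = r[i:i + order]                    # window centred on the pitch lag
            h = np.tanh(W1 @ x + b1)
            y = W2 @ h + b2
            e = y - r[k]
            gW2 = e * h                           # back-propagated gradients of 0.5*e**2
            gb2 = e
            gh = e * W2 * (1.0 - h**2)
            gW1 = np.outer(gh, x)
            W1 -= lr * gW1
            b1 -= lr * gh
            W2 -= lr * gW2
            b2 -= lr * gb2
            norm = np.linalg.norm(W1)             # projection step: keep the assumed
            if norm > w_max:                      # inequality constraint ||W1||_F <= w_max
                W1 *= w_max / norm
    return W1, b1, W2, b2

# Toy usage on a stand-in residual (white noise), with an assumed pitch lag of 40 samples
rng = np.random.default_rng(1)
r = rng.standard_normal(400)
W1, b1, W2, b2 = train_constrained_predictor(r, lag=40)
```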

Rotation-invariant pattern recognition system with constrained neural network (회전량에 불변인 제한 신경회로망을 이용한 패턴인식)

  • 나희승;박영진
    • Institute of Control, Robotics and Systems Conference Proceedings (제어로봇시스템학회 학술대회논문집) / 1992.10a / pp.619-623 / 1992
  • In pattern recognition, conventional neural networks contain a large number of weights and require considerable training time, as well as a preprocessor, to classify transformed patterns. In this paper, we propose a constrained pattern recognition method that is insensitive to rotation of the input pattern by arbitrary angles and does not need any preprocessing. Because such constrained neural networks cannot be trained by conventional training algorithms such as error back-propagation, a novel training algorithm is suggested. Such a system is useful for problems like distinguishing the obverse and reverse sides of a 500 won coin, and this identification problem is shown as an illustrative example.
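
The paper's network and training rule are not described in the abstract, so the sketch below shows just one assumed way to build in rotation invariance and train under the constraint: the pattern is taken to be sampled at N angular positions, a single kernel is tied across all rotations so the pooled score is invariant by construction, and the "constrained" back-propagation step accumulates the gradient over every tied copy of the kernel.

```python
import numpy as np

def rot_invariant_forward(x, w, v, b, c):
    """x: pattern at N angular positions (e.g. an unwrapped coin image, assumed),
    w: kernel shared across all rotations, v/b/c: output weight and biases."""
    N = len(x)
    h = np.tanh(np.array([np.roll(w, j) @ x for j in range(N)]) + b)  # rotation-equivariant layer
    z = v * h.sum() + c                                               # sum pooling -> invariant score
    return h, z

def train_step(x, t, w, v, b, c, lr=0.01):
    """One constrained back-propagation step: the gradient for the tied kernel
    is accumulated over every rotation, so the weight sharing is preserved."""
    N = len(x)
    h, z = rot_invariant_forward(x, w, v, b, c)
    p = 1.0 / (1.0 + np.exp(-z))              # probability of class "obverse" (t = 1)
    dz = p - t                                # cross-entropy gradient at the output
    dh = dz * v * (1.0 - h**2)
    gw = np.zeros_like(w)
    for j in range(N):                        # accumulate over all tied kernel copies
        gw += dh[j] * np.roll(x, -j)
    w -= lr * gw
    v -= lr * dz * h.sum()
    b -= lr * dh.sum()
    c -= lr * dz
    return w, v, b, c
```

Because the pooled score is invariant by construction, a rotated copy of the same pattern yields the same output and the same accumulated kernel gradient, so no preprocessing or rotation-specific retraining is needed in this sketch.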
