• Title/Summary/Keyword: Robust backpropagation algorithm (로버스트 역전파 알고리즘)

5 search results

Robust Neural Networks for Regression Analysis (회귀분석을 위한 로버스트 신경망)

  • 황창하;김상민;박희주
    • Communications for Statistical Applications and Methods / v.4 no.2 / pp.327-332 / 1997
  • Multilayer neural networks are one method of nonparametric regression function estimation. The backpropagation algorithm is widely used to train multilayer neural networks. However, this algorithm is very sensitive to outliers, and for data containing outliers it estimates an undesirable regression function. In this paper we propose a robust backpropagation algorithm using a method frequently employed in statistical physics, and compare it, through simulation, with projection pursuit regression (PPR), which is mathematically very similar to neural networks, and with the standard backpropagation algorithm.


Pattern Recognition using Robust Feedforward Neural Networks (로버스트 다층전방향 신경망을 이용한 패턴인식)

  • Hwang, Chang-Ha;Kim, Sang-Min
    • Journal of the Korean Data and Information Science Society / v.9 no.2 / pp.345-355 / 1998
  • The backpropagation (BP) algorithm allows multilayer feedforward neural networks to learn input-output mappings from training samples. It iteratively adjusts the network parameters (weights) to minimize the sum of squared approximation errors using a gradient descent technique. However, the mapping acquired through the BP algorithm may be corrupted when erroneous training data are employed. In this paper two types of robust backpropagation algorithms are discussed, both from a theoretical point of view and in case studies of nonlinear regression function estimation and handwritten Korean character recognition. For future research we suggest a Bayesian learning approach to neural networks and compare it with the two robust backpropagation algorithms.

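The criterion being minimized is the crux here: with the sum of squared errors, a single gross error contributes an unbounded term to the gradient, which is how the learned mapping gets corrupted. A minimal numpy sketch of the robustification idea on a one-weight linear fit; this is an illustration using a generic Huber-style clipped influence function, not the specific algorithms studied in the paper:

```python
import numpy as np

# Training pairs from y = 2x, with two gross errors injected.
x = np.linspace(0.0, 1.0, 20)
y = 2.0 * x
y[3] += 10.0
y[15] -= 10.0

def fit(psi, lr=0.05, epochs=500):
    """Gradient descent on a single weight w for y ~ w * x.

    psi(e) is the influence function, i.e. the derivative of the
    per-sample error criterion with respect to the residual e.
    """
    w = 0.0
    for _ in range(epochs):
        e = w * x - y                    # residuals
        w -= lr * np.mean(psi(e) * x)    # dE/dw averaged over samples
    return w

# Squared error: psi(e) = e, so a gross error pulls w without bound.
w_lms = fit(lambda e: e)

# Huber-style criterion: influence clipped at +-1, gross errors capped.
w_robust = fit(lambda e: np.clip(e, -1.0, 1.0))

print(w_lms, w_robust)   # w_robust stays close to the true slope 2
```

With these data the squared-error fit is dragged well away from the true slope, while the clipped-influence fit lands near 2; that bounded-influence property is what the robust BP variants aim for inside a multilayer network.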

A Robust Backpropagation Algorithm and Its Application (문자인식을 위한 로버스트 역전파 알고리즘)

  • Oh, Kwang-Sik;Kim, Sang-Min;Lee, Dong-No
    • Journal of the Korean Data and Information Science Society / v.8 no.2 / pp.163-171 / 1997
  • Function approximation from a set of input-output pairs has numerous applications in scientific and engineering areas. Multilayer feedforward neural networks have been proposed as good approximators of nonlinear functions. The backpropagation (BP) algorithm allows multilayer feedforward neural networks to learn input-output mappings from training samples. It iteratively adjusts the network parameters (weights) to minimize the sum of squared approximation errors using a gradient descent technique. However, the mapping acquired through the BP algorithm may be corrupted when erroneous training data are employed; the learned mapping can then oscillate badly between data points. In this paper we propose a robust BP learning algorithm that is resistant to erroneous data and is capable of rejecting gross errors during the approximation process: it is stable under small noise perturbation and robust against gross errors.


Robust Error Measure for Back Propagation Algorithm (로버스트 역전파 알고리즘을 위한 오차함수)

  • 김현철;이철원
    • The Korean Journal of Applied Statistics / v.12 no.2 / pp.505-515 / 1999
  • We propose a new error function that makes the backpropagation algorithm used to fit artificial neural network models robust, and we carried out simulations following the procedure proposed by Liano to assess the performance of the new method. The results show that the new method is as stable as the LMS method and more robust than Liano's LMLS method. An analysis of a real data set also shows that the method is meaningful. In particular, the new method behaves well even on samples with no errors or only small errors, so it can be used regardless of whether gross errors are present.

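The LMLS ("least mean log squares") criterion mentioned above, due to Liano, replaces the squared error by a slowly growing logarithmic term, so a residual's influence on the weight update is bounded and in fact decays for gross errors. A short sketch, assuming the usual form of the criterion, rho(e) = log(1 + e^2/2):

```python
import numpy as np

def lms(e):
    """Squared-error criterion used by the standard BP algorithm."""
    return 0.5 * e ** 2

def lmls(e):
    """Liano's least-mean-log-squares criterion."""
    return np.log(1.0 + 0.5 * e ** 2)

def lmls_influence(e):
    """Derivative of LMLS w.r.t. the residual: e / (1 + e^2/2).

    Approximately e for small residuals (LMS-like behaviour),
    decaying toward 0 for gross errors (outliers are downweighted).
    """
    return e / (1.0 + 0.5 * e ** 2)

print(lmls_influence(0.1))    # ~0.0995: behaves like LMS on small errors
print(lmls_influence(100.0))  # ~0.02: a gross error barely moves the weights
```

For small residuals LMLS and LMS criteria nearly coincide, which is consistent with the abstract's point that a good robust error function should also behave well on clean or lightly contaminated samples.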

A Robust Backpropagation Algorithm for Function Approximation (함수근사를 위한 로버스트 역전파 알고리즘)

  • Kim, Sang-Min;Hwang, Chang-Ha
    • The Transactions of the Korea Information Processing Society / v.4 no.3 / pp.747-753 / 1997
  • Function approximation from a set of input-output pairs has numerous applications in scientific and engineering areas. Multilayer feedforward neural networks have been proposed as good approximators of nonlinear functions. The backpropagation (BP) algorithm allows multilayer feedforward neural networks to learn input-output mappings from training samples. However, the mapping acquired through the BP algorithm may be corrupted when erroneous training data are employed. In this paper we propose a robust BP learning algorithm that is resistant to erroneous data and is capable of rejecting gross errors during the approximation process.
