• Title/Summary/Keyword: Multi-layer back propagation

Search Results: 132

The Structure of Boundary Decision Using the Back Propagation Algorithms (역전파 알고리즘을 이용한 경계결정의 구성에 관한 연구)

  • Lee, Ji-Young
    • The Journal of Information Technology
    • /
    • v.8 no.1
    • /
    • pp.51-56
    • /
    • 2005
  • The back-propagation algorithm is a very effective supervised training method for multi-layer feed-forward neural networks. This paper studies decision boundary formation based on the back-propagation algorithm. The discriminating powers of several neural network topologies are also investigated against five manually created data sets. It is found that neural networks with multiple hidden layers perform better than those with a single hidden layer.

  • PDF
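The abstracts in this listing all build on the same core procedure. As a point of reference, here is a minimal, illustrative back-propagation sketch (not code from any of the papers): a one-hidden-layer MLP trained on XOR-style data, whose two classes no single straight line can separate, so the decision boundary must be formed by the hidden layer. All sizes, the learning rate, and the iteration count are arbitrary choices for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# XOR-style data: not linearly separable, so the decision
# boundary has to come from the hidden layer.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
t = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(0, 1, (2, 4)); b1 = np.zeros(4)   # input -> hidden
W2 = rng.normal(0, 1, (4, 1)); b2 = np.zeros(1)   # hidden -> output
lr = 0.5
loss_history = []

for _ in range(5000):
    h = sigmoid(X @ W1 + b1)            # forward pass: hidden activations
    y = sigmoid(h @ W2 + b2)            # forward pass: network output
    loss_history.append(float(((y - t) ** 2).mean()))
    d2 = (y - t) * y * (1 - y)          # output delta (squared-error loss)
    d1 = (d2 @ W2.T) * h * (1 - h)      # delta back-propagated to hidden layer
    W2 -= lr * h.T @ d2; b2 -= lr * d2.sum(0)
    W1 -= lr * X.T @ d1; b1 -= lr * d1.sum(0)

print(f"loss: {loss_history[0]:.3f} -> {loss_history[-1]:.3f}")
```

The gradient-descent nature of this update is exactly what several of the abstracts below criticize: it can converge to a local minimum rather than the global one.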

A study on the realization of color printed material check using Error Back-Propagation rule (오류 역전파법으로구현한 컬러 인쇄물 검사에 관한 연구)

  • Han, Hee-Seok;Lee, Kyu-Young
    • Proceedings of the Korean Institute of Intelligent Systems Conference
    • /
    • 1998.10a
    • /
    • pp.560-567
    • /
    • 1998
  • This paper concerns color printed material images captured by a camera; median filtering is applied to the input image under identical conditions to reduce noise and distortion. It also proposes a way to compare the color tone of normal and abnormal printed material by training an error back-propagation network for block classification, extracting the R, G, B values at five locations from identical 3×3 blocks of the color printed material. As a representative algorithm for the multi-layer perceptron, the error back-propagation technique is used to solve complex problems. However, error back-propagation is basically a gradient descent method, so training may converge to a local minimum rather than the global minimum, and the network structure must be chosen appropriately for a given problem. In this paper, good results are obtained by improving the initial conditions and adjusting the number of hidden layers, addressing the problems of real-time processing, learning, and training.

  • PDF
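The median-filtering preprocessing step mentioned in the abstract can be sketched as follows. This is a generic median filter, not the paper's implementation; the window size is illustrative.

```python
import numpy as np

def median_filter(img, k=3):
    """Replace each pixel with the median of its k x k neighborhood.
    Effective against impulse ("salt-and-pepper") noise, which is why
    it is a common preprocessing step before classification."""
    pad = k // 2
    padded = np.pad(img, pad, mode='edge')   # replicate border pixels
    out = np.empty_like(img)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = np.median(padded[i:i + k, j:j + k])
    return out
```

A single bright noise pixel in an otherwise dark region is outvoted by its eight neighbors and removed, while large uniform regions pass through unchanged.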

Efficient weight initialization method in multi-layer perceptrons

  • Han, Jaemin;Sung, Shijoong;Hyun, Changho
    • Proceedings of the Korean Operations and Management Science Society Conference
    • /
    • 1995.09a
    • /
    • pp.325-333
    • /
    • 1995
  • Back-propagation is the most widely used algorithm for supervised learning in multi-layer feed-forward networks. However, back-propagation is very slow to converge. In this paper, a new weight initialization method for multi-layer perceptrons, called rough map initialization, is proposed. To overcome the long convergence time, possibly due to the random weight initialization of existing multi-layer perceptrons, the rough map initialization method initializes weights by exploiting the relationship between input and output features using a singular value decomposition technique. The results of this initialization procedure are compared to those of the random initialization procedure on encoder and XOR problems.

  • PDF
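The abstract does not give the rough map formulas, so the following is only one plausible reading of "initializing weights from input-output relationships via SVD": seed the first-layer weights with the leading singular directions of the input-target cross-covariance instead of pure noise. All shapes and scalings here are illustrative assumptions, not the paper's method.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 8))      # toy inputs  (100 samples, 8 features)
T = rng.normal(size=(100, 3))      # toy targets (100 samples, 3 outputs)
n_hidden = 5

# SVD of the cross-covariance picks out the input directions most
# correlated with the targets; use them to seed the hidden weights.
C = X.T @ T                                   # (8, 3) input-output relation
U, s, Vt = np.linalg.svd(C, full_matrices=False)

W1 = np.zeros((8, n_hidden))
W1[:, :U.shape[1]] = U * (s / s.max())        # scaled singular directions
W1[:, U.shape[1]:] = rng.normal(0, 0.1, (8, n_hidden - U.shape[1]))
```

Compared with purely random initialization, hidden units start pointed at directions already known to matter for the outputs, which is the kind of head start the abstract credits for faster convergence.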

Application of Back-propagation Algorithm for the forecasting of Temperature and Humidity (온도 및 습도의 단기 예측에 있어서 역전파 알고리즘의 적용)

  • Jeong, Hyo-Joon;Hwang, Won-Tae;Suh, Kyung-Suk;Kim, Eun-Han;Han, Moon-Hee
    • Journal of Environmental Impact Assessment
    • /
    • v.12 no.4
    • /
    • pp.271-279
    • /
    • 2003
  • Temperature and humidity forecasting were performed using an artificial neural network (ANN) model. We composed the ANN as a multi-layer perceptron with 2 input layers, 2 hidden layers, and 1 output layer. The back-propagation algorithm was used to train the ANN. For training, 6 and 12 nodes in the hidden layers were appropriate for the temperature model, and 9 and 6 nodes were appropriate for the humidity model, respectively. 90% of all the data was used as the learning set, and the remaining 10% was used for model verification. In the temperature case, the average temperature 15 minutes earlier and the present humidity constituted the input layer, and the present temperature constituted the output layer; the humidity model was vice versa. The sensitivity analysis revealed that the previous-value data contributed more to forecasting the target value than the other variables. Temperature was pseudo-linearly related to the previous 15-minute average value. We confirmed that an ANN with a multi-layer perceptron could support a pollutant dispersion model by computing meteorological data in real time.

Learning of multi-layer perceptrons with 8-bit data precision (8비트 데이타 정밀도를 가지는 다층퍼셉트론의 역전파 학습 알고리즘)

  • Oh, Sang-Hoon;Song, Yun-Sun
    • Journal of the Korean Institute of Telematics and Electronics B
    • /
    • v.33B no.4
    • /
    • pp.209-216
    • /
    • 1996
  • In this paper, we propose a learning method for multi-layer perceptrons (MLPs) with 8-bit data precision. The suggested method uses the cross-entropy cost function to remove the slope term of the error signal in the output layer. To decrease the possibility of overflow, we convert 16-bit weighted-sum results into 8-bit data with an appropriate range. In the forward propagation, the range for bit conversion is determined using the saturation property of the sigmoid function. In the backward propagation, the range for bit conversion is derived using the probability density function of the back-propagated signal. In a simulation study classifying handwritten digits from the CEDAR database, our method shows generalization performance similar to that of error back-propagation learning with 16-bit precision.

  • PDF
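Why the cross-entropy cost "removes the slope term" can be shown in a few lines. With a sigmoid output and squared error, the output delta carries a factor y(1 − y) that vanishes when the unit saturates; with cross-entropy that factor cancels, leaving simply y − t. This sketch illustrates only that cancellation, not the paper's 8-bit range derivations.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

y = sigmoid(np.array([-2.0, 0.5, 3.0]))   # saturated and unsaturated outputs
t = np.array([0.0, 1.0, 1.0])             # targets

delta_mse = (y - t) * y * (1 - y)  # squared error: slope term y(1-y) remains
delta_ce  = (y - t)                # cross-entropy: slope term cancels out
```

Since 0 < y(1 − y) ≤ 0.25, the cross-entropy delta is always at least as large in magnitude, which keeps the back-propagated signal alive near saturation; a useful property when it must also survive quantization to 8 bits.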

Time Series Prediction Using a Multi-layer Neural Network with Low Pass Filter Characteristics (저주파 필터 특성을 갖는 다층 구조 신경망을 이용한 시계열 데이터 예측)

  • Min-Ho Lee
    • Journal of Advanced Marine Engineering and Technology
    • /
    • v.21 no.1
    • /
    • pp.66-70
    • /
    • 1997
  • In this paper a new learning algorithm with curvature smoothing and improved generalization for multi-layer neural networks is proposed. To enhance the generalization ability, a constraint term on the hidden neuron activations is added to the conventional output error, which gives curvature-smoothing characteristics to multi-layer neural networks. When the total cost, consisting of the output error and the hidden error, is minimized by gradient-descent methods, the additional descent term yields not only Hebbian learning but also synaptic weight decay. It therefore incorporates error back-propagation, Hebbian learning, and weight decay, and the additional computational requirement over standard error back-propagation is negligible. Computer simulation of time series prediction with the Santa Fe competition data shows that the proposed learning algorithm gives much better generalization performance.

  • PDF
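The abstract does not state the exact constraint term, so as an assumed stand-in take a quadratic penalty λ·Σⱼ hⱼ² on the hidden activations. Its gradient with respect to the input-to-hidden weights contains the Hebbian-flavored product of input and hidden activity that the abstract describes (this sketch shows only that part, not the weight-decay component).

```python
import numpy as np

rng = np.random.default_rng(3)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

x = rng.normal(size=(1, 3))      # one input sample
W = rng.normal(size=(3, 4))      # input-to-hidden weights
h = sigmoid(x @ W)               # hidden activations
lam = 0.01

# d/dW_ij of lam * sum_j h_j^2  =  lam * 2 h_j * h_j(1-h_j) * x_i:
# the x_i * h_j product is the Hebbian-like coupling.
grad_penalty = lam * x.T @ (2 * h * h * (1 - h))
```

A finite-difference check confirms the analytic gradient, so adding this term to the usual back-propagated gradient costs only one extra outer product per layer, matching the abstract's claim of negligible overhead.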

Hydrological Modelling of Water Level near "Hahoe Village" Based on Multi-Layer Perceptron

  • Oh, Sang-Hoon;Wakuya, Hiroshi
    • International Journal of Contents
    • /
    • v.12 no.1
    • /
    • pp.49-53
    • /
    • 2016
  • "Hahoe Village" in the Andong region is a UNESCO World Heritage Site. It should be protected against various disasters such as fire, flooding, and earthquakes. Among these disasters, flooding has a drastic impact on lives and property over a wide area. Since "Hahoe Village" is adjacent to the Nakdong River, it is important to monitor the water level near the village. In this paper, we developed a hydrological model using a multi-layer perceptron (MLP) to predict the water level of the Nakdong River near "Hahoe Village". To develop the prediction model, the error back-propagation (EBP) algorithm was used to train the MLP with water level data near the village and rainfall data from the upper reaches of the village. After training with data from 2012 and 2013, we verified the prediction performance of the MLP with untrained data from 2014.

A Study of the Automatic Berthing System of a Ship Using Artificial Neural Network (인공신경망을 이용한 선박의 자동접안 제어에 관한 연구)

  • Bae, Cheol-Han;Lee, Seung-Keon;Lee, Sang-Eui;Kim, Ju-Han
    • Journal of Navigation and Port Research
    • /
    • v.32 no.8
    • /
    • pp.589-596
    • /
    • 2008
  • In this paper, an Artificial Neural Network (ANN) is applied to automatic berthing control for a ship. An ANN is suitable for maneuvering tasks such as a ship's berthing because it can describe the non-linearity of the system. A multi-layer perceptron, which has one or more hidden layers between the input layer and the output layer, is used as the ANN. Using a back-propagation algorithm with teaching data, we trained the ANN to minimize the error between the output value and the desired one. For the automatic berthing control of a container ship, we introduced low-speed maneuvering mathematical models. Berthing control with 8 input-layer units in the ANN is compared to that with 6 input-layer units. The simulation results show that the berthing conditions are satisfied, even though the berthing paths differ.

Searching a global optimum by stochastic perturbation in error back-propagation algorithm (오류 역전파 학습에서 확률적 가중치 교란에 의한 전역적 최적해의 탐색)

  • Kim, Sam-Keun;Min, Chang-Woo;Kim, Myung-Won
    • Journal of the Korean Institute of Telematics and Electronics C
    • /
    • v.35C no.3
    • /
    • pp.79-89
    • /
    • 1998
  • The Error Back-Propagation (EBP) algorithm is widely applied to train a multi-layer perceptron, a neural network model frequently used to solve complex problems such as pattern recognition, adaptive control, and global optimization. However, EBP is basically a gradient descent method, which may get stuck in a local minimum and thus fail to find the globally optimal solution. Moreover, a multi-layer perceptron lacks a systematic way to determine the network structure appropriate for a given problem; the number of hidden nodes is usually determined by trial and error. In this paper, we propose a new algorithm to efficiently train a multi-layer perceptron. Our algorithm uses stochastic perturbation in the weight space to effectively escape from local minima in multi-layer perceptron learning: it probabilistically re-initializes the weights associated with hidden nodes whenever EBP learning gets stuck in a local minimum. The addition of new hidden nodes can also be viewed as a special case of stochastic perturbation. Using stochastic perturbation, we can solve the local minima problem and the network structure design problem in a unified way. The results of our experiments with several benchmark test problems, including the parity problem, the two-spirals problem, and the credit-screening data, show that our algorithm is very efficient.

  • PDF
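The stochastic-perturbation idea can be sketched as a helper called between EBP epochs. The "stuck" test, the perturbation probability, and all thresholds below are illustrative assumptions; the paper's actual criteria are not given in the abstract.

```python
import numpy as np

rng = np.random.default_rng(2)

def perturb_if_stuck(W_hidden, loss_history, window=50, tol=1e-4, p=0.3):
    """If the loss has not improved over the last `window` epochs (a crude
    'stuck in a local minimum' test), re-initialize each hidden unit's
    incoming weight column with probability p. Returns (weights, perturbed)."""
    if len(loss_history) < window:
        return W_hidden, False
    recent = loss_history[-window:]
    if recent[0] - recent[-1] > tol:           # still improving: leave alone
        return W_hidden, False
    mask = rng.random(W_hidden.shape[1]) < p   # hidden units chosen to reset
    W = W_hidden.copy()
    W[:, mask] = rng.normal(0, 0.5, (W.shape[0], int(mask.sum())))
    return W, True
```

Resetting only some hidden units keeps most of the learned solution intact while kicking the search out of the basin of the local minimum; adding a brand-new hidden node is, as the abstract notes, a special case of the same move.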

A Simple Approach of Improving Back-Propagation Algorithm

  • Zhu, H.;Eguchi, K.;Tabata, T.;Sun, N.
    • Proceedings of the IEEK Conference
    • /
    • 2000.07b
    • /
    • pp.1041-1044
    • /
    • 2000
  • The enhancement to the back-propagation algorithm presented in this paper resulted from the need to extract sparsely connected networks from networks employing product terms. The enhancement works in conjunction with the back-propagation weight update process, so that the actions of weight zeroing and weight stimulation reinforce each other. It is shown that the error measure can also be interpreted as a rate of weight change (as opposed to ${\Delta}W_{ij}$), and consequently used to determine when weights have reached a stable state. Weights judged to be stable are then compared to a zero-weight threshold; should they fall below this threshold, the weights in question are zeroed. Simulation of such a system is shown to yield improved learning rates and reduced network connection requirements, with respect to the optimal network solution trained using the normal back-propagation algorithm, for Multi-Layer Perceptron (MLP), Higher Order Neural Network (HONN), and Sigma-Pi networks.

  • PDF
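The stable-then-zero rule described in the abstract can be sketched as a pruning pass run alongside the weight updates. Both thresholds and the stability measure (mean recent |ΔW|) are illustrative assumptions, not the paper's exact criteria.

```python
import numpy as np

def zero_stable_weights(W, dW_history, stable_tol=1e-3, zero_thresh=0.05):
    """Prune weights that have (a) stopped changing, judged by the average
    magnitude of their recent updates, and (b) settled near zero. Stable,
    small weights are set exactly to zero to sparsify the network."""
    rate = np.mean(np.abs(np.stack(dW_history)), axis=0)  # avg |dW| per weight
    stable = rate < stable_tol                            # (a) reached a stable state
    W = W.copy()
    W[stable & (np.abs(W) < zero_thresh)] = 0.0           # (b) below zero threshold
    return W
```

Because pruning only fires for weights that are both stable and small, weights that are still being driven by back-propagation are left untouched, which is how zeroing and the normal weight update can reinforce rather than fight each other.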