Bayesian Neural Network with Recurrent Architecture for Time Series Prediction

  • Hong, Chan-Young (Department of Electrical and Electronic Engineering, Yonsei University) ;
  • Park, Jung-Hun (Department of Electrical and Electronic Engineering, Yonsei University) ;
  • Yoon, Tae-Sung (Department of Electrical Engineering, Changwon National University) ;
  • Park, Jin-Bae (Department of Electrical and Electronic Engineering, Yonsei University)
  • Published : 2004.08.25

Abstract

In this paper, a Bayesian recurrent neural network (BRNN) is proposed for time series prediction. Among the various traditional prediction methodologies, neural network methods are considered more effective for non-linear and non-stationary time series data. A neural network predictor requires a proper learning strategy to adjust the network weights, and one must account for the non-linear and non-stationary evolution of those weights. The Bayesian neural network in this paper estimates not a single set of weights but the probability distributions of the weights. In other words, we set the weight vector as the state vector of a state-space model and estimate its probability distribution by Bayesian inference. This approach makes it possible to obtain a more accurate estimate of the weights. Moreover, regarding network architecture, the recurrent feedback structure is known to be superior to the feedforward structure for time series prediction. Therefore, the recurrent network with Bayesian inference, which we call the BRNN, is expected to show higher performance than a conventional neural network. To verify the performance of the proposed method, time series data are numerically generated and the neural network predictors are applied to them. As a result, the BRNN is shown to give better prediction results than a common feedforward Bayesian neural network.
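
The abstract describes treating the weight vector as the state of a state-space model and updating its probability distribution by Bayesian inference. The sketch below illustrates one common way this idea can be realized: the weights of a small Elman-style recurrent network follow a random walk, and their Gaussian-approximated posterior is updated online with an extended Kalman filter. The network sizes, noise levels, the generated series, and the EKF approximation itself are illustrative assumptions, not the paper's exact formulation.

```python
# Minimal sketch (not the paper's implementation): network weights as the state
# of a random-walk state-space model, posterior tracked by an extended Kalman filter.
import numpy as np

rng = np.random.default_rng(0)

n_in, n_hid = 1, 4                                   # assumed small architecture
n_w = n_hid * (n_in + n_hid + 1) + (n_hid + 1)       # all weights flattened into one vector

def rnn_step(w, x, h):
    """One step of an Elman-style recurrent net; returns (scalar output, new hidden state)."""
    i = 0
    W_in  = w[i:i + n_hid * n_in].reshape(n_hid, n_in); i += n_hid * n_in
    W_rec = w[i:i + n_hid * n_hid].reshape(n_hid, n_hid); i += n_hid * n_hid
    b_h   = w[i:i + n_hid]; i += n_hid
    W_out = w[i:i + n_hid]; i += n_hid
    b_out = w[i]
    h_new = np.tanh(W_in @ x + W_rec @ h + b_h)
    return W_out @ h_new + b_out, h_new

def jacobian(w, x, h, eps=1e-5):
    """Numerical Jacobian of the network output with respect to the weight vector."""
    y0, _ = rnn_step(w, x, h)
    J = np.zeros(n_w)
    for k in range(n_w):
        wp = w.copy(); wp[k] += eps
        yk, _ = rnn_step(wp, x, h)
        J[k] = (yk - y0) / eps
    return J

# Weight posterior: mean w_hat and covariance P (Gaussian approximation).
w_hat = 0.1 * rng.standard_normal(n_w)
P = 0.1 * np.eye(n_w)
Q = 1e-5 * np.eye(n_w)                               # random-walk process noise on the weights
R = 1e-2                                             # measurement noise variance

# Toy non-stationary series: predict y[t] from y[t-1].
t = np.arange(400)
series = np.sin(0.1 * t) * (1 + 0.002 * t) + 0.05 * rng.standard_normal(t.size)

h = np.zeros(n_hid)
for k in range(1, series.size):
    x = series[k - 1:k]                              # previous value as the input
    P = P + Q                                        # time update: covariance grows
    H = jacobian(w_hat, x, h)                        # linearize output around current mean
    y_pred, h = rnn_step(w_hat, x, h)
    S = H @ P @ H + R                                # innovation variance
    K = P @ H / S                                    # Kalman gain over the weight vector
    w_hat = w_hat + K * (series[k] - y_pred)         # measurement update of the weight mean
    P = P - np.outer(K, H @ P)                       # measurement update of the covariance

print("final one-step prediction error:", abs(series[-1] - y_pred))
```

Because the filter maintains a full covariance over the weights rather than a single point estimate, the predictor can keep adapting as the statistics of the series drift, which is the property the abstract attributes to the Bayesian treatment.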

Keywords