• Title/Abstract/Keyword: Feedforward neural networks (FFNN)


Efficient Markov Chain Monte Carlo for Bayesian Analysis of Neural Network Models

  • Paul E. Green; Changha Hwang; Sangbock Lee
    • Journal of the Korean Statistical Society, Vol. 31, No. 1, pp. 63-75, 2002
  • Most attempts at Bayesian analysis of neural networks involve hierarchical modeling. We believe that similar results can be obtained with simpler models that require less computational effort, as long as appropriate restrictions are placed on parameters in order to ensure propriety of posterior distributions. In particular, we adopt a model first introduced by Lee (1999) that utilizes an improper prior for all parameters. Straightforward Gibbs sampling is possible, with the exception of the bias parameters, which are embedded in nonlinear sigmoidal functions. In addition to the problems posed by nonlinearity, direct sampling from the posterior distributions of the bias parameters is further complicated by the duplication of hidden nodes, which is a source of multimodality. In this regard, we focus on sampling from the marginal posterior distribution of the bias parameters with Markov chain Monte Carlo methods that combine traditional Metropolis sampling with a slice sampler described by Neal (1997, 2001). The methods are illustrated with data examples that are largely confined to the analysis of nonparametric regression models.
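
The sketch below is not the authors' code; it only illustrates, under simplified assumptions, the kind of sampler the abstract describes: alternating a random-walk Metropolis update with a Neal-style slice sampler (stepping out plus shrinkage) on the conditional posterior of each hidden-node bias in a one-hidden-layer sigmoidal network, with all other parameters held fixed and a flat prior on the biases. The network size, data, and tuning constants are hypothetical.

```python
# Hedged sketch: Metropolis + slice-sampling updates for the bias parameters
# of a one-hidden-layer sigmoidal network, other parameters held fixed.
import numpy as np

rng = np.random.default_rng(0)

# Toy nonparametric-regression data: y = f(x) + noise (hypothetical)
n = 100
x = rng.uniform(-3, 3, size=n)
y = np.sin(x) + rng.normal(scale=0.2, size=n)

# "Current" values of the remaining network parameters (hypothetical)
H = 3                                   # number of hidden nodes
w_in = rng.normal(size=H)               # input-to-hidden weights
b = rng.normal(size=H)                  # hidden-node biases (sampling target)
w_out = rng.normal(size=H)              # hidden-to-output weights
b_out = 0.0                             # output bias
sigma2 = 0.2 ** 2                       # error variance

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def log_post_bias(bk, k):
    """Log conditional posterior of bias k (flat prior, Gaussian likelihood)."""
    b_tmp = b.copy()
    b_tmp[k] = bk
    f = w_out @ sigmoid(np.outer(w_in, x) + b_tmp[:, None]) + b_out
    return -0.5 * np.sum((y - f) ** 2) / sigma2

def metropolis_step(bk, k, scale=0.5):
    """Random-walk Metropolis update for one bias parameter."""
    prop = bk + scale * rng.normal()
    if np.log(rng.uniform()) < log_post_bias(prop, k) - log_post_bias(bk, k):
        return prop
    return bk

def slice_step(bk, k, w=1.0, m=50):
    """Slice-sampling update with stepping out and shrinkage (after Neal)."""
    logy = log_post_bias(bk, k) + np.log(rng.uniform())   # slice level
    left = bk - w * rng.uniform()
    right = left + w
    # stepping out: expand the interval until it brackets the slice
    j = int(m * rng.uniform())
    kk = m - 1 - j
    while j > 0 and logy < log_post_bias(left, k):
        left -= w; j -= 1
    while kk > 0 and logy < log_post_bias(right, k):
        right += w; kk -= 1
    # shrinkage: sample uniformly, shrinking toward the current point on rejection
    while True:
        prop = rng.uniform(left, right)
        if logy < log_post_bias(prop, k):
            return prop
        if prop < bk:
            left = prop
        else:
            right = prop

# Alternate the two updates for each bias parameter
for it in range(200):
    for k in range(H):
        b[k] = metropolis_step(b[k], k)
        b[k] = slice_step(b[k], k)

print("final bias draws:", b)
```

In a full analysis the other weights would be updated by Gibbs steps and many posterior draws would be retained; the multimodality caused by hidden-node duplication is precisely why a heavier-tailed exploration scheme such as the slice sampler is combined with plain Metropolis moves here.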