Variational Bayesian inference for binary image restoration using Ising model

  • Jang, Moonsoo (Department of Statistics, Pusan National University)
  • Chung, Younshik (Department of Statistics, Pusan National University)
  • Received : 2021.02.12
  • Accepted : 2021.12.21
  • Published : 2022.01.31

Abstract

In this paper, we focus on removing noise from binary images using variational Bayesian inference with the Ising model. The observation and the latent variable are the degraded image and the original image, respectively. The posterior distribution is built from a Markov random field with an Ising prior, so estimating the posterior distribution amounts to reconstructing the degraded image. MCMC and variational Bayesian inference are the two standard methods for approximating the posterior; for computational efficiency, we adopt the variational approach. Because the variational update for each pixel depends on its neighbours, the restoration is computed iteratively. Since the model has three parameters, the restoration is implemented with the VECM algorithm, which finds appropriate parameter values at each iteration. Finally, we present the restoration results that attain the maximum peak signal-to-noise ratio (PSNR) and evidence lower bound (ELBO).
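To make the variational step concrete, the sketch below implements the standard mean-field update for an Ising-prior denoiser, mu_i = tanh(beta * (sum of neighbouring means) + eta * y_i), where y_i in {-1, +1} is the observed pixel. This is a minimal illustration of the general technique, not the paper's implementation: the coupling strengths beta and eta are placeholder values (the paper estimates its three model parameters via the VECM step, which is omitted here), and the updates are applied synchronously for brevity rather than pixel by pixel.

```python
import numpy as np

def mean_field_denoise(y, beta=2.0, eta=1.5, n_iters=30):
    """Mean-field variational denoising of a binary image.

    y    : 2-D array with entries in {-1, +1} (the degraded image).
    beta : Ising coupling between neighbouring pixels (assumed value).
    eta  : coupling between each latent pixel and its observation (assumed value).
    Returns the mean field mu with mu[i, j] = E_q[x_ij] in [-1, 1].
    """
    mu = y.astype(float).copy()  # initialise q(x) from the observation
    for _ in range(n_iters):
        # Sum of neighbouring means over the 4-neighbourhood (borders zero-padded).
        nbr = np.zeros_like(mu)
        nbr[1:, :] += mu[:-1, :]   # neighbour above
        nbr[:-1, :] += mu[1:, :]   # neighbour below
        nbr[:, 1:] += mu[:, :-1]   # neighbour to the left
        nbr[:, :-1] += mu[:, 1:]   # neighbour to the right
        # Mean-field update: mu_i = tanh(beta * sum_j mu_j + eta * y_i),
        # applied synchronously here for brevity.
        mu = np.tanh(beta * nbr + eta * y)
    return mu

# Usage: threshold the mean field at zero to obtain the binary restoration.
# x_hat = np.where(mean_field_denoise(y) >= 0, 1, -1)
```

Thresholding the returned mean field at zero gives a binary restoration, which can then be scored by PSNR against the clean image to compare parameter settings, as in the paper.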

References

  1. Besag J (1974). Spatial interaction and the statistical analysis of lattice systems, Journal of the Royal Statistical Society Series B, 36, 192-236.
  2. Besag J (1986). On the statistical analysis of dirty pictures, Journal of the Royal Statistical Society Series B, 48, 259-302.
  3. Besag J, York J, and Mollié A (1991). Bayesian image restoration, with two applications in spatial statistics, Annals of the Institute of Statistical Mathematics, 43, 1-59. https://doi.org/10.1007/BF00116466
  4. Bishop C (2006). Pattern Recognition and Machine Learning, New York, Springer-Verlag.
  5. Blei DM, Kucukelbir A, and McAuliffe JD (2017). Variational inference: A review for statisticians, Journal of the American Statistical Association, 112, 859-877. https://doi.org/10.1080/01621459.2017.1285773
  6. Casella G and George EI (1992). Explaining the Gibbs sampler, The American Statistician, 46, 167-174. https://doi.org/10.2307/2685208
  7. Casella G (2001). Empirical Bayes Gibbs sampling, Biostatistics, 2, 485-500. https://doi.org/10.1093/biostatistics/2.4.485
  8. Dempster AP, Laird NM, and Rubin DB (1977). Maximum likelihood from incomplete data via the EM algorithm, Journal of the Royal Statistical Society Series B, 39, 1-38.
  9. Friedman N (2013). The Bayesian Structural EM Algorithm, arXiv preprint arXiv:1301.7373.
  10. Gelfand AE and Smith AFM (1990). Sampling-based approaches to calculating marginal densities, Journal of the American Statistical Association, 85, 398-409. https://doi.org/10.1080/01621459.1990.10476213
  11. Geman S and Geman D (1984). Stochastic relaxation, Gibbs distributions, and the Bayesian restoration of images, IEEE Transactions on Pattern Analysis and Machine Intelligence, 6, 721-741. https://doi.org/10.1109/TPAMI.1984.4767596
  12. Hastings WK (1970). Monte Carlo sampling methods using Markov chains and their applications, Biometrika, 57, 97-109. https://doi.org/10.1093/biomet/57.1.97
  13. Jordan MI, Ghahramani Z, Jaakkola TS, and Saul LK (1999). An introduction to variational methods for graphical models, Machine Learning, 37, 183-233. https://doi.org/10.1023/A:1007665907178
  14. Meng XL and Rubin DB (1993). Maximum likelihood estimation via the ECM algorithm: A general framework, Biometrika, 80, 267-278. https://doi.org/10.1093/biomet/80.2.267
  15. Metropolis N, Rosenbluth AW, Rosenbluth MN, Teller AH, and Teller E (1953). Equation of state calculations by fast computing machines, The Journal of Chemical Physics, 21, 1087-1092.
  16. Nasios N and Bors AG (2006). Variational learning for Gaussian mixture models, IEEE Transactions on Systems, Man, and Cybernetics, Part B (Cybernetics), 36, 849-862. https://doi.org/10.1109/TSMCB.2006.872273
  17. Parisi G (1988). Statistical Field Theory, Redwood City, Addison-Wesley.
  18. Peterson C and Anderson JR (1987). A mean field theory learning algorithm for neural networks, Complex Systems, 1, 995-1019.
  19. Smith AFM and Roberts GO (1993). Bayesian computation via the Gibbs sampler and related Markov chain Monte Carlo methods, Journal of the Royal Statistical Society Series B, 55, 3-23.
  20. Tian H, Shen T, Hao B, Hu Y, and Yang N (2009). Image restoration based on adaptive MCMC particle filter, 2009 2nd International Congress on Image and Signal Processing, IEEE, 1-5.
  21. Tierney L and Kadane JB (1986). Accurate approximations for posterior moments and marginal densities, Journal of the American Statistical Association, 81, 82-86. https://doi.org/10.1080/01621459.1986.10478240
  22. Zhang C, Bütepage J, Kjellström H, and Mandt S (2019). Advances in variational inference, IEEE Transactions on Pattern Analysis and Machine Intelligence, 41, 2008-2026. https://doi.org/10.1109/tpami.2018.2889774