Learning and Propagation Framework of Bayesian Network using Meta-Heuristics and EM algorithm considering Dynamic Environments

  • Choo, Sanghyun (School of Industrial Engineering, Kumoh National Institute of Technology)
  • Lee, Hyunsoo (School of Industrial Engineering, Kumoh National Institute of Technology)
  • Received : 2016.09.29
  • Reviewed : 2016.10.19
  • Published : 2016.10.25

Abstract


When dynamic changes occur in an existing Bayesian Network (BN), the parameters embedded in the BN have to be updated to new values that adapt to the newly formed data patterns. These parameters must be updated with the causalities in the BN taken into account. This research suggests a framework for updating parameters dynamically using the Expectation Maximization (EM) algorithm and the Harmony Search (HS) algorithm, one of several meta-heuristic techniques. While EM is an effective algorithm for estimating hidden parameters, it has the limitation that the generated solution usually converges to a local optimum. To overcome this limitation, this paper applies HS to drive the Maximum Likelihood Estimators (MLE) of the parameters toward globally optimal values. The proposed method thus provides a learning and propagation framework for BNs under dynamic changes that compensates for the disadvantages of the EM algorithm and converges to a global optimum of the MLE of the parameters.
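The core idea of coupling HS with likelihood maximization can be illustrated with a minimal, self-contained sketch. This is not the authors' implementation: it applies the standard Harmony Search loop (Geem et al., 2001) to a toy Bernoulli log-likelihood standing in for one conditional-probability-table entry of a BN, and every function name and parameter value here is an illustrative assumption.

```python
import math
import random

def harmony_search(objective, dim, bounds, hms=10, hmcr=0.9, par=0.3,
                   bw=0.05, iters=2000, seed=0):
    """Maximize `objective` over a box [lo, hi]^dim with Harmony Search.

    hms  : harmony memory size
    hmcr : harmony memory consideration rate
    par  : pitch adjustment rate
    bw   : pitch-adjustment bandwidth
    """
    rng = random.Random(seed)
    lo, hi = bounds
    # Initialize the harmony memory with random candidate solutions.
    memory = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(hms)]
    scores = [objective(h) for h in memory]
    for _ in range(iters):
        new = []
        for d in range(dim):
            if rng.random() < hmcr:                 # memory consideration
                x = memory[rng.randrange(hms)][d]
                if rng.random() < par:              # pitch adjustment
                    x += rng.uniform(-bw, bw)
            else:                                   # random selection
                x = rng.uniform(lo, hi)
            new.append(min(hi, max(lo, x)))         # clamp to the box
        s = objective(new)
        worst = min(range(hms), key=lambda i: scores[i])
        if s > scores[worst]:                       # replace worst harmony
            memory[worst], scores[worst] = new, s
    best = max(range(hms), key=lambda i: scores[i])
    return memory[best], scores[best]

# Toy stand-in for an MLE target: Bernoulli log-likelihood of one CPT
# entry, with 30 successes observed in 40 trials (the MLE is 0.75).
def log_lik(theta):
    p = theta[0]
    return 30 * math.log(p) + 10 * math.log(1 - p)

theta, ll = harmony_search(log_lik, dim=1, bounds=(1e-6, 1 - 1e-6))
```

In the full framework described by the abstract, `objective` would be the (expected) log-likelihood produced during EM's E-step over the incomplete data, so that HS searches the parameter space globally instead of following EM's locally convergent M-step alone.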

Keywords
