An Information-theoretic Approach for Value-Based Weighting in Naive Bayesian Learning

  • Received : 2010.05.11
  • Accepted : 2010.10.04
  • Published : 2010.12.15

Abstract

In this paper, we propose a new paradigm of weighting for naive Bayesian learning. Whereas existing weighting methods assign a weight to each attribute, we propose a more fine-grained approach, called value weighting, that assigns a weight to each attribute value. Using the Kullback-Leibler divergence, we develop new methods for both value weighting and feature weighting in the naive Bayesian setting. We compared the performance of the proposed methods with an attribute weighting method and with standard naive Bayes; the proposed method performs better in most cases.

In this study, we propose a new way of computing attribute weights in the setting of naive Bayesian learning. Whereas existing methods assign a weight to each attribute, we go a step further and study a new scheme that assigns a weight to each attribute value. To compute these attribute-value weights, we propose a method based on the Kullback-Leibler divergence and analyze the properties of the resulting weights. The proposed algorithm was compared with attribute weighting on a number of datasets and was found to perform better in most cases.
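The value-weighting idea described in the abstract can be sketched in code. This is a minimal illustration under assumptions, not the paper's exact formulation: it assumes the weight of an attribute value v is the Kullback-Leibler divergence KL(P(C|v) || P(C)), and that this weight is used as an exponent on the corresponding likelihood in the naive Bayes product; all function and variable names are hypothetical.

```python
import math
from collections import Counter, defaultdict

def train(X, y, alpha=1.0):
    """Estimate class priors, per-class value likelihoods, and a
    KL-divergence-based weight for every attribute value."""
    n = len(y)
    classes = sorted(set(y))
    counts = Counter(y)
    prior = {c: (counts[c] + alpha) / (n + alpha * len(classes)) for c in classes}

    # Count value occurrences per attribute and class (for Laplace-smoothed P(v | c)).
    like = defaultdict(lambda: defaultdict(Counter))   # like[i][c][v] = count
    values = defaultdict(set)                          # values[i] = domain of attribute i
    for xi, c in zip(X, y):
        for i, v in enumerate(xi):
            like[i][c][v] += 1
            values[i].add(v)

    def p_v_given_c(i, v, c):
        total = sum(like[i][c].values())
        return (like[i][c][v] + alpha) / (total + alpha * len(values[i]))

    # Assumed value weight: KL( P(C | attribute i = v) || P(C) ).
    weight = {}
    for i in values:
        for v in values[i]:
            joint = {c: prior[c] * p_v_given_c(i, v, c) for c in classes}
            z = sum(joint.values())
            post = {c: joint[c] / z for c in classes}
            weight[(i, v)] = sum(post[c] * math.log(post[c] / prior[c]) for c in classes)

    return prior, p_v_given_c, weight, classes

def predict(x, model):
    """Value-weighted naive Bayes: each log-likelihood term is scaled by
    the weight of the attribute value it concerns."""
    prior, p_v_given_c, weight, classes = model
    scores = {}
    for c in classes:
        s = math.log(prior[c])
        for i, v in enumerate(x):
            s += weight.get((i, v), 0.0) * math.log(p_v_given_c(i, v, c))
        scores[c] = s
    return max(scores, key=scores.get)

# Toy weather data: (outlook, temperature) -> play
X = [("sunny", "hot"), ("sunny", "mild"), ("rain", "mild"), ("rain", "hot")]
y = ["no", "no", "yes", "yes"]
model = train(X, y)
print(predict(("rain", "hot"), model))  # -> yes
```

An attribute value that carries no information about the class (here, "mild", which is equally likely under both classes) receives weight zero, so it drops out of the product entirely, while a discriminative value such as "rain" is emphasized.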
