PERFORMANCE EVALUATION OF INFORMATION CRITERIA FOR THE NAIVE-BAYES MODEL IN THE CASE OF LATENT CLASS ANALYSIS: A MONTE CARLO STUDY

  • Dias, Jose G. (Department of Quantitative Methods and GIESTA-UNIDE, Higher Institute of Social Sciences and Business Studies-ISCTE)
  • Published: 2007.09.30

Abstract

This paper addresses for the first time the use of complete-data information criteria in unsupervised learning of the Naive-Bayes model. A Monte Carlo study with a large experimental design, unusual in the Bayesian network literature, assesses these criteria. The simulation results show that the complete-data information criteria underperform the Bayesian information criterion (BIC) for these Bayesian networks.
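To make the comparison concrete, the sketch below (illustrative only, not code from the paper) computes BIC alongside one well-known complete-data criterion, ICL-BIC (Biernacki et al., 2000), from the output of an EM fit of a latent class model; the function name and its arguments are hypothetical.

```python
import numpy as np

def information_criteria(loglik, tau, n_params):
    """Compare BIC with a complete-data criterion (ICL-BIC).

    loglik   : maximized observed-data log-likelihood of the fitted model
    tau      : (n, K) posterior class-membership probabilities from EM
    n_params : number of free parameters in the model
    """
    n = tau.shape[0]
    # BIC (Schwarz, 1978): fit penalized by model complexity.
    bic = -2.0 * loglik + n_params * np.log(n)
    # Entropy of the posterior classification: zero for a perfectly
    # separated solution, large when the latent classes overlap.
    entropy = -np.sum(tau * np.log(np.clip(tau, 1e-300, None)))
    # ICL-BIC: BIC plus an entropy term, i.e. a complete-data
    # information criterion that rewards well-separated classes.
    icl_bic = bic + 2.0 * entropy
    return bic, icl_bic
```

In a model-selection run one would fit latent class models for a range of class counts K, evaluate each criterion on every fit, and retain the K that minimizes the chosen criterion; the entropy term is what makes ICL-BIC, unlike BIC, sensitive to how cleanly the observations are classified.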

Keywords

References

  1. AKAIKE, H. (1974). 'A new look at the statistical model identification', IEEE Transactions on Automatic Control, 19, 716-723 https://doi.org/10.1109/TAC.1974.1100705
  2. BIERNACKI, C. AND GOVAERT, G. (1997). 'Using the classification likelihood to choose the number of clusters', Computing Science and Statistics, 29, 451-457
  3. BIERNACKI, C., CELEUX, G. AND GOVAERT, G. (1999). 'An improvement of the NEC criterion for assessing the number of clusters in a mixture model', Pattern Recognition Letters, 20, 267-272 https://doi.org/10.1016/S0167-8655(98)00144-5
  4. BIERNACKI, C., CELEUX, G. AND GOVAERT, G. (2000). 'Assessing a mixture model for clustering with the integrated completed likelihood', IEEE Transactions on Pattern Analysis and Machine Intelligence, 22, 719-725 https://doi.org/10.1109/34.865189
  5. CELEUX, G. AND SOROMENHO, G. (1996). 'An entropy criterion for assessing the number of clusters in a mixture model', Journal of Classification, 13, 195-212 https://doi.org/10.1007/BF01246098
  6. DIAS, J. G. (2004). 'Controlling the level of separation of components in Monte Carlo studies of latent class models', In Classification, Clustering, and Data Mining Applications (Banks, D., House, L., McMorris, F. R., Arabie, P. and Gaul, W., eds.), 77-84, Springer, Berlin
  7. DIAS, J. G. AND WEDEL, M. (2004). 'An empirical comparison of EM, SEM and MCMC performance for problematic Gaussian mixture likelihoods', Statistics and Computing, 14, 323-332 https://doi.org/10.1023/B:STCO.0000039481.32211.5a
  8. DUDA, R. O., HART, P. E. AND STORK, D. G. (2001). Pattern Classification, 2nd ed., Wiley-Interscience, New York
  9. FRIEDMAN, N., GEIGER, D. AND GOLDSZMIDT, M. (1997). 'Bayesian network classifiers', Machine Learning, 29, 131-163 https://doi.org/10.1023/A:1007465528199
  10. GOODMAN, L. A. (1974). 'Exploratory latent structure analysis using both identifiable and unidentifiable models', Biometrika, 61, 215-231 https://doi.org/10.1093/biomet/61.2.215
  11. HATHAWAY, R. J. (1986). 'Another interpretation of the EM algorithm for mixture distributions', Statistics & Probability Letters, 4, 53-56 https://doi.org/10.1016/0167-7152(86)90016-7
  12. McLACHLAN, G. AND PEEL, D. (2000). Finite Mixture Models, Wiley-Interscience, New York
  13. PEARL, J. (1988). Probabilistic Reasoning in Intelligent Systems, Morgan Kaufmann, San Mateo
  14. RISSANEN, J. (1987). 'Stochastic complexity', Journal of the Royal Statistical Society, Ser. B, 49, 223-239
  15. SCHWARZ, G. (1978). 'Estimating the dimension of a model', The Annals of Statistics, 6, 461-464 https://doi.org/10.1214/aos/1176344136
  16. TIERNEY, L. AND KADANE, J. B. (1986). 'Accurate approximations for posterior moments and marginal densities', Journal of the American Statistical Association, 81, 82-86 https://doi.org/10.2307/2287970