
Ensemble Learning of Regional Experts

Lee, Byung-Woo (Department of Computer Science and Engineering, Sogang University)
Yang, Ji-Hoon (Department of Computer Science and Engineering, Sogang University)
Kim, Seon-Ho (Department of Computer Science and Engineering, Sogang University)
Abstract
We present a new ensemble learning method that employs a set of regional experts, each of which learns to handle a subset of the training data. We split the training data and generate an expert for each region of the feature space. When classifying an instance, we apply weighted voting among the experts whose regions contain the instance. We used ten datasets to compare the performance of our new ensemble method with that of single classifiers and of other ensemble methods such as Bagging and AdaBoost, using SMO, Naive Bayes, and C4.5 as base learning algorithms. We found that the performance of our method is comparable to that of AdaBoost and Bagging when the base learner is C4.5, and that our method outperformed the benchmark methods in the remaining cases.
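
To make the procedure concrete, here is a minimal Python sketch of one plausible reading of the method. The abstract does not specify how regions are constructed, how region membership is tested, or how vote weights are chosen, so this sketch assumes k-means clustering for region construction, a per-region radius for membership, and each expert's accuracy on its own region as its voting weight. The class name RegionalExpertEnsemble and all parameter names are hypothetical, and a scikit-learn decision tree stands in for C4.5.

import numpy as np
from sklearn.cluster import KMeans
from sklearn.tree import DecisionTreeClassifier  # stand-in for C4.5

class RegionalExpertEnsemble:
    """Hypothetical sketch of the regional-experts idea in the abstract."""

    def __init__(self, n_regions=5, base=DecisionTreeClassifier):
        self.n_regions = n_regions
        self.base = base

    def fit(self, X, y):
        # Split the training data into regions (assumption: k-means clusters).
        self.km = KMeans(n_clusters=self.n_regions, n_init=10).fit(X)
        labels = self.km.labels_
        self.experts, self.weights, radii = [], [], []
        for r in range(self.n_regions):
            idx = labels == r
            expert = self.base().fit(X[idx], y[idx])
            self.experts.append(expert)
            # Assumption: weight = the expert's accuracy on its own region.
            self.weights.append(expert.score(X[idx], y[idx]))
            # Assumption: a region's radius is the distance from its centroid
            # to its farthest member.
            dists = np.linalg.norm(X[idx] - self.km.cluster_centers_[r], axis=1)
            radii.append(dists.max())
        self.radii = np.array(radii)
        return self

    def predict(self, X):
        preds = []
        for x in np.asarray(X):
            # Experts whose region contains x cast weighted votes.
            d = np.linalg.norm(self.km.cluster_centers_ - x, axis=1)
            members = np.where(d <= self.radii)[0]
            if members.size == 0:          # fall back to the nearest region
                members = np.array([d.argmin()])
            votes = {}
            for r in members:
                c = self.experts[r].predict(x.reshape(1, -1))[0]
                votes[c] = votes.get(c, 0.0) + self.weights[r]
            preds.append(max(votes, key=votes.get))
        return np.array(preds)

# Usage on synthetic data:
#   X = np.random.rand(200, 2)
#   y = (X[:, 0] + X[:, 1] > 1.0).astype(int)
#   model = RegionalExpertEnsemble(n_regions=4).fit(X, y)
#   print((model.predict(X) == y).mean())

Note the contrast with the benchmark methods: Bagging and AdaBoost resample or reweight the whole training set, whereas this scheme partitions it, so each expert sees, and votes on, only its own neighborhood of the feature space.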
Keywords
Ensemble Learning; Boosting; Bagging; Region Experts
References
1 T. G. Dietterich, "Ensemble methods in machine learning," LNCS, Vol.1857, pp. 1-15, 2000
2 E. Bauer and R. Kohavi, "An empirical comparison of voting classification algorithms: bagging, boosting, and variants," Machine Learning, Vol.36, No.1-2, pp. 105-142, 1999
3 D. Opitz and R. Maclin, "Popular ensemble methods: an empirical study," Journal of Artificial Intelligence Research, Vol.11, pp. 169-198, 1999
4 J. Quinlan, "Induction of Decision Trees," Machine Learning, Vol.1, No.1, pp. 81-106, 1986
5 Y. Freund and R. Schapire, "A decision-theoretic generalization of on-line learning and an application to boosting," Journal of Computer and System Sciences, Vol.55, pp. 119-139, 1997
6 L. Breiman, "Bagging predictors," Machine Learning, Vol.24, No.2, pp. 123-140, 1996
7 T. G. Dietterich, "An experimental comparison of three methods for constructing ensembles of decision trees: bagging, boosting, and randomization," Machine Learning, Vol.40, No.2, pp. 139-157, 2000
8 L. Hansen and P. Salamon, "Neural network ensembles," IEEE Trans. PAMI, Vol.12, pp. 993-1001, 1990
9 T. Evgeniou, L. Perez-Breva, M. Pontil and T. Poggio, "Bounds on the generalization performance of kernel machine ensembles," Proc. ICML, pp. 271-278, 2000
10 J. Platt, "Fast training of support vector machines using sequential minimal optimization," in Advances in Kernel Methods, B. Scholkopf, C. Burges, and A. Smola, Eds., The MIT Press, pp. 185-208, 1999
11 G. Valentini, M. Muselli and F. Ruffino, "Bagged Ensembles of SVMs for Gene Expression Data Analysis," The IEEE-INNS-ENNS International Joint Conference on Neural Networks, pp. 1844-1849, 2003
12 C. Blake and C. Merz, UCI Repository of Machine Learning Databases, http://www.ics.uci.edu/~mlearn/MLRepository.html, 1998
13 J. Quinlan, C4.5: Programs for Machine Learning, Morgan Kaufmann, 1993
14 R. O. Duda, P. E. Hart and D. G. Stork, Pattern Classification, 2nd ed., Wiley-Interscience, 2000
15 I. Buciu, C. Kotropoulos and I. Pitas, "Combining support vector machines for accurate face detection," Proc. ICIP, pp. 1054-1057, 2001
16 I. Witten and E. Frank, Data Mining: Practical Machine Learning Tools and Techniques with Java Implementations, 2nd ed., Morgan Kaufmann, San Francisco, 2005
17 Y. Freund and R. Schapire, "Experiments with a new boosting algorithm," Proc. 13th International Conf. on Machine Learning, pp. 148-156, 1996