http://dx.doi.org/10.5351/CKSS.2006.13.2.449

Pruning the Boosting Ensemble of Decision Trees  

Yoon, Young-Joo (Department of Statistics, Seoul National University)
Song, Moon-Sup (Department of Statistics, Seoul National University)
Publication Information
Communications for Statistical Applications and Methods / v.13, no.2, 2006, pp. 449-466
Abstract
We propose to use variable selection methods based on penalized regression for pruning decision tree ensembles. Pruning methods based on LASSO and SCAD are compared with the cluster pruning method. Comparative studies are performed on both artificial and real datasets. The results show that the proposed methods based on penalized regression reduce the size of boosting ensembles without significantly decreasing accuracy, and that they outperform the cluster pruning method. In the presence of classification noise, the proposed pruning methods can also mitigate the weakness of AdaBoost to some degree.
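The general idea can be sketched as follows (a minimal illustration, not the authors' implementation): fit an AdaBoost ensemble of trees, regress the class labels on the base trees' predictions with an L1 (LASSO) penalty, and retain only the trees that receive non-zero coefficients. The dataset, the scikit-learn estimators, and the penalty value alpha=0.01 below are illustrative assumptions; the SCAD penalty would require a nonconvex solver and is omitted here.

# Hypothetical sketch: LASSO-based pruning of an AdaBoost tree ensemble.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.linear_model import Lasso
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Full boosting ensemble of shallow trees.
ada = AdaBoostClassifier(DecisionTreeClassifier(max_depth=2),
                         n_estimators=200, random_state=0)
ada.fit(X_tr, y_tr)

# Matrix of base-learner outputs in {-1, +1}; each column is one tree.
H = np.column_stack([2 * t.predict(X_tr) - 1 for t in ada.estimators_])

# LASSO on the +-1 labels selects a sparse subset of trees;
# alpha controls how aggressively the ensemble is pruned (assumed value).
lasso = Lasso(alpha=0.01).fit(H, 2 * y_tr - 1)
kept = np.flatnonzero(lasso.coef_)
print(f"kept {kept.size} of {len(ada.estimators_)} trees")

# Pruned ensemble: sign of the weighted vote over the retained trees.
H_te = np.column_stack([2 * t.predict(X_te) - 1 for t in ada.estimators_])
pred = np.sign(H_te[:, kept] @ lasso.coef_[kept] + lasso.intercept_)
print(f"pruned-ensemble accuracy: {np.mean(pred == 2 * y_te - 1):.3f}")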
Keywords
AdaBoost; Penalized regression; Cluster pruning; LASSO; SCAD; Pruning ensemble;
Citations & Related Records
연도 인용수 순위
  • Reference
1 Breiman, L., Friedman, J., Olshen, R. and Stone, C. (1984). Classification and Regression Trees, Chapman and Hall, New York
2 Margineantu, D.D. and Dietterich, T.G. (1997). Pruning adaptive boosting. Proceedings of the 14th International Conference on Machine Learning, 211-218
3 Mason, L., Baxter, J., Bartlett, P.L. and Frean, M. (2000). Functional gradient techniques for combining hypotheses. In A. J. Smola, P. L. Bartlett, B. Scholkopf and D. Schuurmans, editors, Advances in Large Margin Classifiers, MIT Press, Cambridge
4 Merz, C.J. and Murphy, P.M. (1998). UCI Repository of Machine Learning Databases. Available at http://www.ics.uci.edu/~mlearn/MLRepository.html
5 Quinlan, J.R. (1993). C4.5: Programs for Machine Learning, Morgan Kaufmann, San Mateo, CA
6 Quinlan, J.R. (1996). Bagging, boosting, and C4.5. Proceedings of the 13th National Conference on Artificial Intelligence, 725-730
7 Rosset, S., Zhu, J. and Hastie, T. (2004). Boosting as a regularized path to a maximum margin classifier. Journal of Machine Learning Research, Vol. 5, 941-973
8 Tamon, C. and Xiang, J. (2000). On the boosting pruning problem. Proceedings of the 11th European Conference on Machine Learning, Lecture Notes in Computer Science, Vol. 1810, 404-412
9 Tibshirani, R. (1996). Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society, Series B, Vol. 58, 267-288
10 Tibshirani, R. and Knight, K. (1999). Model selection and inference by bootstrap 'bumping'. Journal of Computational and Graphical Statistics, Vol. 8, 671-686
11 Breiman, L. (1996). Bagging predictors. Machine Learning, Vol. 24, 123-140
12 Breiman, L. (1998). Arcing classifiers (with discussion). Annals of Statistics, Vol. 26, 801-849
13 Friedman, J. (2001). Greedy function approximation: a gradient boosting machine. Annals of Statistics, Vol. 29, 1189-1232
14 Dietterich, T.G. (2000). An experimental comparison of three methods for constructing ensembles of decision trees: bagging, boosting, and randomization. Machine Learning, Vol. 40, 139-157
15 Fan, J. and Li, R. (2001). Variable selection via nonconcave penalized likelihood and its oracle properties. Journal of the American Statistical Association, Vol. 96, 1348-1360
16 Freund, Y. and Schapire, R.E. (1997). A decision-theoretic generalization of on-line learning and an application to boosting. Journal of Computer and System Sciences, Vol. 55, 119-139
17 Hastie, T., Tibshirani, R. and Friedman, J.H. (2001). The Elements of Statistical Learning, Springer-Verlag, New York
18 Heskes, T. (1997). Balancing between bagging and bumping. In Mozer, M., Jordan, M. and Petsche, T., editors, Advances in Neural Information Processing Systems, Morgan Kaufmann
19 Lazarevic, A. and Obradovic, Z. (2001). The effective pruning of neural network ensembles. Proceedings of the 2001 IEEE/INNS International Joint Conference on Neural Networks, 796-801