http://dx.doi.org/10.7465/jkdi.2012.23.2.375

Study on the ensemble methods with kernel ridge regression  

Kim, Sun-Hwa (Department of Statistics, Pusan National University)
Cho, Dae-Hyeon (Department of Data Science, Inje University)
Seok, Kyung-Ha (Department of Data Science, Inje University)
Publication Information
Journal of the Korean Data and Information Science Society / v.23, no.2, 2012, pp. 375-383
Abstract
The purpose of ensemble methods is to increase prediction accuracy by combining many classifiers. Recent studies have shown that random forests and forward stagewise regression achieve good accuracy in classification problems. However, because they use decision trees as base learners, they suffer large prediction errors near the separation boundary. In this study, we use kernel ridge regression instead of decision trees as the base learner in random forests and boosting. The usefulness of the proposed ensemble methods is illustrated by experiments on the prostate cancer and Boston housing data.
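The abstract describes the method only at a high level, so the following is a minimal sketch of the idea it names: kernel ridge regression (KRR) replacing decision trees as the base learner, once in a bagged, random-forest-style ensemble and once in an L2-boosting loop in the forward stagewise spirit. The RBF kernel, the ridge penalty lam, the step size nu, and the ensemble sizes are illustrative assumptions, not the authors' specification.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # Gaussian (RBF) kernel matrix between the rows of A and the rows of B
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

class KernelRidge:
    """Kernel ridge regression: solve (K + lam*I) alpha = y."""
    def __init__(self, lam=1.0, gamma=1.0):
        self.lam, self.gamma = lam, gamma
    def fit(self, X, y):
        self.X = X
        K = rbf_kernel(X, X, self.gamma)
        self.alpha = np.linalg.solve(K + self.lam * np.eye(len(X)), y)
        return self
    def predict(self, Xnew):
        return rbf_kernel(Xnew, self.X, self.gamma) @ self.alpha

def bagged_krr(X, y, n_learners=50, lam=1.0, gamma=1.0, seed=0):
    # Random-forest-style ensemble: each KRR fits a bootstrap resample,
    # and predictions are averaged over the ensemble.
    rng = np.random.default_rng(seed)
    models = []
    for _ in range(n_learners):
        idx = rng.integers(0, len(X), len(X))
        models.append(KernelRidge(lam, gamma).fit(X[idx], y[idx]))
    return lambda Xnew: np.mean([m.predict(Xnew) for m in models], axis=0)

def boosted_krr(X, y, n_steps=100, nu=0.1, lam=1.0, gamma=1.0):
    # L2 boosting: each KRR fits the current residuals and is added
    # with a small step size nu (forward stagewise flavour).
    fit, resid, models = np.zeros(len(y)), y.copy(), []
    for _ in range(n_steps):
        m = KernelRidge(lam, gamma).fit(X, resid)
        models.append(m)
        fit += nu * m.predict(X)
        resid = y - fit
    return lambda Xnew: nu * sum(m.predict(Xnew) for m in models)

# Toy check on a nonlinear signal
rng = np.random.default_rng(1)
X = rng.uniform(-3, 3, (200, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(200)
Xte = np.linspace(-3, 3, 100)[:, None]
print(bagged_krr(X, y)(Xte)[:3])
print(boosted_krr(X, y)(Xte)[:3])
```

Because each KRR fit is smooth, averaging bootstrap fits or accumulating small residual fits avoids the piecewise-constant jumps at split points that the abstract attributes to tree base learners.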
Keywords
Boosting; ensemble method; forward stagewise regression; kernel ridge regression; random forest
Citations & Related Records
Times Cited by KSCI: 4
1 Breiman, L. (2001). Random forests. Machine Learning, 45, 5-32.
2 Cho, D. (2010). Mixed-effects LS-SVR for longitudinal data. Journal of the Korean Data & Information Science Society, 21, 363-369.
3 Cho, D., Shim, J. and Seok, K. H. (2010). Doubly penalized kernel method for heteroscedastic autoregressive data. Journal of the Korean Data & Information Science Society, 21, 155-162.
4 Efron, B., Hastie, T., Johnstone, I. and Tibshirani, R. (2004). Least angle regression. Annals of Statistics, 32, 407-451.
5 Freund, Y. and Schapire, R. (1997). A decision-theoretic generalization of on-line learning and an application to boosting. Journal of Computer and System Sciences, 55, 119-139.
6 Hastie, T., Taylor, J., Tibshirani, R. and Walther, G. (2007). Forward stagewise regression and the monotone lasso. Electronic Journal of Statistics, 1, 1-29.
7 Hwang, H. (2010). Variable selection for multiclassification by LS-SVM. Journal of the Korean Data & Information Science Society, 21, 959-965.
8 Shim, J. (2011). Variable selection in the kernel Cox regression. Journal of the Korean Data & Information Science Society, 22, 795-801.
9 Tibshirani, R. (1996). Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society B, 58, 267-288.
10 Breiman, L. (1996). Bagging predictors. Machine Learning, 24, 123-140.