References
- Asuncion, A. and Newman, D. J. (2007). UCI machine learning repository. University of California, Irvine, School of Information and Computer Science, http://www.ics.uci.edu/~mlearn/MLRepository.html.
- Breiman, L. (1996). Bagging predictors. Machine Learning, 24, 123-140.
- Choi, J. S., Lee, S. H. and Cho, H. J. (2010). A study for improving data mining methods for continuous response variables. Journal of the Korean Data & Information Science Society, 21(5), 917-926.
- Clemen, R. (1989). Combining forecasts: A review and annotated bibliography. International Journal of Forecasting, 5, 559-583. https://doi.org/10.1016/0169-2070(89)90012-5
- Freund, Y. (1995). Boosting a weak learning algorithm by majority. Information and Computation, 121, 256-285. https://doi.org/10.1006/inco.1995.1136
- Freund, Y. and Schapire, R. (1996). Game theory, on-line prediction and boosting. Proceedings of the Ninth Annual Conference on Computational Learning Theory, 325-332.
- Freund, Y. and Schapire, R. (1997). A decision-theoretic generalization of on-line learning and an application to boosting. Journal of Computer and System Sciences, 55, 119-139. https://doi.org/10.1006/jcss.1997.1504
- Friedman, J., Hastie, T. and Tibshirani, R. (2000). Additive logistic regression: A statistical view of boosting (with discussion). Annals of Statistics, 28, 337-407.
- Hastie, T., Tibshirani, R. and Friedman, J. (2001). The elements of statistical learning: Data mining, inference, and prediction, Springer, New York.
- Heinz, G., Peterson, L. J., Johnson, R. W. and Kerk, C. J. (2003). Exploring relationships in body dimensions. Journal of Statistics Education, 11, http://www.amstat.org/publications/jse/v11n2/datasets.heinz.html.
- Jung, Y. H., Eo, S. H., Moon, H. S. and Cho, H. J. (2010). A study for improving the performance of data mining using ensemble techniques. Journal of the Korean Data & Information Science Society, 21, 917-926.
- Kearns, M. and Valiant, L. G. (1994). Cryptographic limitations on learning Boolean formulae and finite automata. Journal of the Association for Computing Machinery, 41, 67-95. https://doi.org/10.1145/174644.174647
- Kim, H. and Loh, W.-Y. (2001). Classification trees with unbiased multiway splits. Journal of the American Statistical Association, 96, 589-604. https://doi.org/10.1198/016214501753168271
- Kim, H. and Loh, W.-Y. (2003). Classification trees with bivariate linear discriminant node models. Journal of Computational and Graphical Statistics, 12, 512-530. https://doi.org/10.1198/1061860032049
- Kim, H., Kim, H., Moon, H. and Ahn, H. (2010). A weight-adjusted voting algorithm for ensemble of classifiers. Journal of the Korean Statistical Society, 40, 437-439.
- Loh, W.-Y. (2002). Regression trees with unbiased variable selection and interaction detection. Statistica Sinica, 12, 361-386.
- Loh, W.-Y. (2009). Improving the precision of classification trees. The Annals of Applied Statistics, 3, 1710-1737. https://doi.org/10.1214/09-AOAS260
- Perrone, M. (1993). Improving regression estimation: Averaging methods for variance reduction with extensions to general convex measure optimization, Ph.D. dissertation, Department of Physics, Brown University.
- Schapire, R. E. (1990). The strength of weak learnability. Machine Learning, 5, 197-227.
- Schapire, R. E. and Singer, Y. (1999). Improved boosting algorithms using confidence-rated predictions. Machine Learning, 37, 297-336. https://doi.org/10.1023/A:1007614523901
- Statlib. (2010). Datasets archive. Carnegie Mellon University, Department of Statistics, http://lib.stat.cmu.edu.
- Terhune, J. M. (1994). Geographical variation of harp seal underwater vocalisations. Canadian Journal of Zoology, 72, 892-897. https://doi.org/10.1139/z94-121
- Valiant, L. G. (1984). A theory of the learnable. Communications of the ACM, 27, 1134-1142. https://doi.org/10.1145/1968.1972
- Wolpert, D. (1992). Stacked generalization. Neural Networks, 5, 241-259. https://doi.org/10.1016/S0893-6080(05)80023-1
- Zhu, J., Zou, H., Rosset, S. and Hastie, T. (2009). Multi-class AdaBoost. Statistics and Its Interface, 2, 349-360. https://doi.org/10.4310/SII.2009.v2.n3.a8