http://dx.doi.org/10.7236/IJIBC.2019.11.1.47

Ensemble Methods Applied to Classification Problem  

Kim, ByungJoo (Department of Computer Engineering, Youngsan University)
Publication Information
International Journal of Internet, Broadcasting and Communication, Vol. 11, No. 1, 2019, pp. 47-53
Abstract
The idea of ensemble learning is to train multiple models, each with the objective of predicting or classifying a set of results. Most of the errors in a model's predictions come from three main factors: variance, noise, and bias. By using ensemble methods, we're able to increase the stability of the final model and reduce these errors. By combining many models, we're able to reduce the variance, even when the individual models are not strong on their own. In this paper we propose an ensemble model and apply it to classification problems. On the iris, Pima Indian diabetes, and semiconductor fault detection problems, the proposed model classifies well compared to traditional single classifiers, namely logistic regression, SVM, and random forest.
Keywords
Ensemble model; Decision trees; Bagging; Overfitting;
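
The paper itself does not include code, but the bagging idea described in the abstract can be illustrated with a short sketch. The example below assumes scikit-learn, the Iris data bundled with scikit-learn, and an arbitrary choice of a decision-tree base learner with 100 estimators; it is not the paper's exact experimental setup. A bagged ensemble of trees is compared against single logistic regression and SVM baselines.

```python
# Minimal sketch of bagging-style ensembling on the Iris data.
# Hyperparameters and the train/test split are illustrative assumptions.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import BaggingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)

# Single classifiers of the kind used as baselines in the paper
# (exact hyperparameters are assumed).
models = {
    "logistic regression": LogisticRegression(max_iter=1000),
    "SVM": SVC(),
    # Bagging: fit many high-variance trees on bootstrap samples and
    # aggregate their votes, which lowers the variance of the combined model.
    "bagged trees": BaggingClassifier(
        DecisionTreeClassifier(), n_estimators=100, random_state=0),
}

for name, model in models.items():
    model.fit(X_train, y_train)
    acc = accuracy_score(y_test, model.predict(X_test))
    print(f"{name}: test accuracy = {acc:.3f}")
```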