http://dx.doi.org/10.7232/JKIIE.2016.42.4.263

Boosted Regression Method based on Rejection Limits for Large-Scale Data  

Kwon, Hyuk-Ho (Dept. of Mechanical Convergence Engineering, Hanyang University)
Kim, Seung-Wook (Dept. of Mechanical Convergence Engineering, Hanyang University)
Choi, Dong-Hoon (School of Mechanical Engineering, Hanyang University)
Lee, Kichun (Industrial Engineering, Hanyang University)
Publication Information
Journal of Korean Institute of Industrial Engineers, v.42, no.4, 2016, pp. 263-269
Abstract
The purpose of this study is to address a computational regression problem, namely handling large-scale data, for which conventional metamodeling techniques often fail in practice. For such problems, regression-type boosting, an ensemble modeling technique, combined with bootstrapping-based re-sampling is a reasonable choice. This study proposes weight updates proportional to the magnitude of the residual itself, together with a new error-decision criterion that constructs an ensemble from models selectively accepted by rejection limits. Based on these ideas, we propose AdaBoost.RMU.R as a metamodeling technique suitable for large-scale data. To assess the performance of the proposed method against existing methods, we used six mathematical test problems. For each problem, we computed the average and the standard deviation of the residuals between true and predicted response values. The results show that both the average and the standard deviation for AdaBoost.RMU.R improved over those of the other algorithms.
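The boosting scheme described above can be sketched roughly as follows. This is an illustrative reconstruction, not the authors' published pseudocode: the choice of weighted least squares as the base learner, the `rejection_limit` value, and the exact form of the residual-based weight update are all assumptions made for the sake of a runnable example.

```python
import numpy as np

def fit_wls(X, y, w):
    """Weighted least-squares base learner (a simple stand-in)."""
    Xb = np.hstack([np.ones((len(y), 1)), X])
    sw = np.sqrt(w)
    coef, *_ = np.linalg.lstsq(Xb * sw[:, None], y * sw, rcond=None)
    return coef

def predict(X, coef):
    return np.hstack([np.ones((len(X), 1)), X]) @ coef

def boosted_regression(X, y, n_rounds=30, rejection_limit=0.25, seed=0):
    """Regression boosting with bootstrapped re-sampling, residual-magnitude
    weight updates, and a rejection limit that discards weak models."""
    rng = np.random.default_rng(seed)
    n = len(y)
    w = np.full(n, 1.0 / n)          # sample weights
    kept = []                        # accepted (model, amplitude) pairs
    for _ in range(n_rounds):
        # bootstrap resample according to the current sample weights
        idx = rng.choice(n, size=n, p=w)
        coef = fit_wls(X[idx], y[idx], np.full(n, 1.0 / n))
        resid = np.abs(y - predict(X, coef))
        rel_err = resid / (np.abs(y).max() + 1e-12)
        err = np.sum(w * rel_err)
        if err > rejection_limit:    # reject models above the limit
            continue
        kept.append((coef, 1.0 / (err + 1e-12)))
        # weight update driven by the residual magnitude itself
        w = w * (1.0 + resid / (resid.max() + 1e-12))
        w /= w.sum()
    def ensemble(Xq):
        num = sum(a * predict(Xq, c) for c, a in kept)
        den = sum(a for _, a in kept)
        return num / den
    return ensemble
```

Only models whose weighted relative error stays under the rejection limit enter the ensemble, which is the selective-acceptance idea the abstract describes; everything else here (amplitudes, normalizations) is one plausible instantiation.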
Keywords
Boosting; Large Data Metamodeling; Ensemble Learning; Regression;