http://dx.doi.org/10.3745/KTSDE.2018.7.9.351

Incremental Ensemble Learning for The Combination of Multiple Models of Locally Weighted Regression Using Genetic Algorithm  

Kim, Sang Hun (Korea Testing & Research Institute)
Chung, Byung Hee (Department of Industrial and Information Systems Engineering, Soongsil University)
Lee, Gun Ho (Department of Industrial and Information Systems Engineering, Soongsil University)
Publication Information
KIPS Transactions on Software and Data Engineering / v.7, no.9, 2018, pp. 351-360
Abstract
The LWR (Locally Weighted Regression) model is traditionally a lazy learning model: a prediction is computed for a given input, the query point, by fitting a regression over a short interval around it, with training samples weighted more heavily the closer they lie to the query point. We study an incremental ensemble learning approach for LWR, a form of lazy, memory-based learning. The proposed method sequentially generates LWR models over time and integrates them with a genetic algorithm to obtain a prediction at a specific query point. A weakness of existing LWR approaches is that many different LWR models can be generated depending on the indicator function and the selected data samples, and the quality of the predictions varies with the chosen model; however, little research has addressed how to select or combine multiple LWR models. In this study, after generating an initial LWR model from an indicator function and a sample data set, we iterate an evolutionary learning process to obtain a suitable indicator function and evaluate the resulting LWR models on other sample data sets in order to overcome data set bias. We adopt an eager learning strategy that incrementally generates and stores LWR models for every interval as data are generated. To obtain a prediction at a specific point in time, an LWR model is built from the newly generated data within a predetermined interval and then combined, via the genetic algorithm, with the existing LWR models for that interval. The proposed method yields better results than combining multiple LWR models by simple averaging. The results are also compared with predictions from multiple regression analysis on real data sets such as hourly traffic volumes in a specific area and hourly sales at a highway rest area.
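To make the two building blocks in the abstract concrete, the sketch below shows (i) an LWR prediction at a single query point using kernel-weighted least squares, and (ii) a simple genetic algorithm that searches for combination weights over the predictions of several LWR models. This is a minimal illustrative sketch, not the authors' implementation: the Gaussian kernel, the bandwidth, the GA operators (truncation selection, uniform crossover, Gaussian mutation), and the names `lwr_predict` / `ga_combine` are all assumptions made for illustration.

```python
import numpy as np

def lwr_predict(X, y, x_query, bandwidth=1.0):
    """Predict at a single query point with locally weighted linear regression.

    Assumption: a Gaussian kernel over Euclidean distance; the paper's actual
    indicator function and interval handling are not reproduced here.
    """
    X = np.asarray(X, dtype=float)
    y = np.asarray(y, dtype=float)
    x_query = np.asarray(x_query, dtype=float)
    # Samples closer to the query point receive larger weights.
    d = np.linalg.norm(X - x_query, axis=1)
    w = np.exp(-(d ** 2) / (2.0 * bandwidth ** 2))
    # Weighted least squares with an intercept: beta = (X'WX)^+ X'Wy
    Xb = np.hstack([np.ones((len(X), 1)), X])
    W = np.diag(w)
    beta = np.linalg.pinv(Xb.T @ W @ Xb) @ Xb.T @ W @ y
    return np.concatenate(([1.0], x_query)) @ beta

def ga_combine(preds, y_true, pop_size=40, generations=100, mut_rate=0.1, seed=0):
    """Search for non-negative combination weights over several models' predictions.

    preds has shape (n_models, n_samples); fitness is the MSE of the weighted
    combination against y_true. Operators here are illustrative choices.
    """
    rng = np.random.default_rng(seed)
    n_models = preds.shape[0]
    pop = rng.random((pop_size, n_models))           # each row: one candidate weight vector
    best = pop[0]
    for _ in range(generations):
        w = pop / pop.sum(axis=1, keepdims=True)     # normalise so weights sum to 1
        mse = ((w @ preds - y_true) ** 2).mean(axis=1)
        order = np.argsort(mse)
        best = pop[order[0]]
        parents = pop[order[: pop_size // 2]]        # truncation selection: keep the better half
        idx = rng.integers(0, len(parents), size=(pop_size, 2))
        mask = rng.random((pop_size, n_models)) < 0.5
        children = np.where(mask, parents[idx[:, 0]], parents[idx[:, 1]])  # uniform crossover
        mut = rng.random((pop_size, n_models)) < mut_rate
        children = np.clip(children + mut * rng.normal(0.0, 0.2, children.shape), 0.0, None)
        children[0] = best                           # elitism: carry the best individual over
        pop = children
    return best / best.sum()
```

Under these assumptions, predictions from several LWR models built on different sample windows could be combined on a validation period as `weights = ga_combine(preds, y_valid)` followed by `weights @ preds`, which is the role the genetic algorithm plays in place of simple averaging.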
Keywords
Locally Weighted Regression; Multi Model Selection; Incremental Ensemble Learning; Genetic Algorithm;