http://dx.doi.org/10.11627/jkise.2020.43.4.107

Generalized Support Vector Quantile Regression  

Lee, Dongju (Industrial & Systems Engineering, Kongju National University)
Choi, Sujin (Department of Metal Mould Design, Korea Polytechnic)
Publication Information
Journal of Korean Society of Industrial and Systems Engineering / v.43, no.4, 2020, pp. 107-115
Abstract
Support Vector Regression (SVR) solves regression problems by exploiting the strong predictive power of the Support Vector Machine. In particular, the ε-insensitive loss function commonly used in SVR imposes no penalty when the difference between the actual value and the estimated regression curve lies within ε. In most studies the ε-insensitive loss function is used symmetrically, and the main concern is choosing the value of ε. In SVQR (Support Vector Quantile Regression), a single parameter p controls both the asymmetry of the width of ε and the slope of the penalty, so the penalty slope is fixed once p, which determines the asymmetry of ε, is chosen. In this study, a new ε-insensitive loss function with two parameters, p1 and p2, is proposed. The resulting asymmetric SVR, called GSVQR (Generalized Support Vector Quantile Regression), controls the asymmetry of the width of ε and the slope of the penalty independently through p1 and p2, respectively. Figures illustrate how each is controlled. Finally, through an experiment on a test function, the accuracy of the existing symmetric soft-margin SVR, the asymmetric SVQR, and the asymmetric GSVQR is examined, and the characteristics of each are shown through figures.
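The abstract describes a loss that is zero inside an asymmetric ε-tube and penalizes points outside it with slopes set independently of the tube split. The paper's exact formula is not reproduced here; the sketch below is only an illustrative assumption of such a two-parameter loss, where p1 splits the total tube width 2ε between the upper and lower sides and p2 splits the penalty slopes. The function name and parameterization are hypothetical, not the authors' notation.

```python
def asymmetric_eps_loss(r, eps=0.5, p1=0.5, p2=0.5):
    """Illustrative two-parameter asymmetric epsilon-insensitive loss.

    r   : residual (actual value minus estimated regression value)
    eps : half-width of the symmetric tube this generalizes
    p1  : splits the tube of total width 2*eps (assumption, not the
          paper's exact formula): upper width 2*eps*p1, lower 2*eps*(1-p1)
    p2  : sets the penalty slopes outside the tube: p2 above, 1-p2 below
    """
    upper = 2.0 * eps * p1          # upper edge of the insensitive zone
    lower = 2.0 * eps * (1.0 - p1)  # (positive) distance to the lower edge
    if r > upper:                   # above the tube: slope p2
        return p2 * (r - upper)
    if r < -lower:                  # below the tube: slope 1 - p2
        return (1.0 - p2) * (-lower - r)
    return 0.0                      # inside the tube: no penalty
```

With p1 = p2 = 0.5 the tube is symmetric with half-width ε and equal slopes on both sides; moving p1 shifts the tube's split while p2 independently tilts the penalty, which is the decoupling the abstract attributes to GSVQR.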
Keywords
Epsilon Insensitive Loss Function; Machine Learning; Asymmetric Support Vector Regression;
Citations & Related Records
Times Cited By KSCI: 2
1 Anagha, P., Balasundaram, S., and Meena, Y., On robust twin support vector regression in primal using squared pinball loss, Journal of Intelligent and Fuzzy Systems, 2018, Vol. 35, No. 5, pp. 5231-5239.
2 Anand, P., Rastogi, R., and Chandra, S., A new asymmetric ε-insensitive pinball loss function based support vector quantile regression model, Applied Soft Computing Journal, 2020, Vol. 94, pp. 1-14.
3 Balasundaram, S. and Prasad, S.C., On pairing Huber support vector regression, Applied Soft Computing Journal, 2020, Vol. 97, Part B, pp. 1-16.
4 Huang, X., Shi, L., and Suykens, J., Support vector machine classifier with pinball loss, IEEE Transactions on Pattern Analysis and Machine Intelligence, 2014, Vol. 36, No. 5, pp. 984-997.
5 Hwang, C.H., Asymmetric least squares regression estimation using weighted least squares support vector machine, Journal of the Korean Data and Information Science Society, 2011, Vol. 22, No. 5, pp. 999-1005.
6 Lee, J.Y. and Kim, J.G., Extracting specific information in web pages using machine learning, Journal of Society of Korea Industrial and Systems Engineering, 2018, Vol. 41, No. 4, pp. 189-195.
7 Li, M.W., Hong, W.C., and Kang, H.G., Urban traffic flow forecasting using Gauss-SVR with cat mapping, cloud model and PSO hybrid algorithm, Neurocomputing, 2013, Vol. 99, pp. 230-240.
8 Smola, A.J. and Schölkopf, B., A tutorial on support vector regression, Statistics and Computing, 2004, Vol. 14, pp. 199-222.
9 Takeuchi, I., Le, Q.V., Sears, T.D., and Smola, A.J., Nonparametric quantile estimation, Journal of Machine Learning Research, 2006, Vol. 7, pp. 1231-1264.
10 Vapnik, V., Statistical Learning Theory, New York, NY: Wiley, 1998.
11 Xu, Y., Li, X., Pan, X., and Yang, Z., Asymmetric ν-twin support vector regression, Neural Computing and Applications, 2018, Vol. 30, pp. 3799-3814.
12 Wu, C.H., Ho, J.M., and Lee, D.T., Travel-time prediction with support vector regression, IEEE Transactions on Intelligent Transportation Systems, 2004, Vol. 5, No. 4, pp. 276-281.