http://dx.doi.org/10.11627/jksie.2022.45.1.071

Application of Asymmetric Support Vector Regression Considering Predictive Propensity  

Lee, Dongju (Department of Industrial & Systems Engineering, Kongju National University)
Publication Information
Journal of Korean Society of Industrial and Systems Engineering, Vol. 45, No. 1, 2022, pp. 71-82
Abstract
Most predictions produced by machine learning are neutral predictions, which assume a symmetric situation in which the predicted value should be neither smaller nor larger than the actual value. In some situations, however, asymmetric prediction, i.e., deliberate over-prediction or under-prediction, can be preferable to neutral prediction, and offering decision makers a range of such predictions can support better judgment. This study proposes Asymmetric Twin Support Vector Regression (ATSVR), which builds on TSVR (Twin Support Vector Regression) and its fast computation time, and which controls the asymmetry of the upper and lower widths of the ε-tube and the asymmetry of the penalty through two parameters. In addition, both the existing GSVQR and the proposed ATSVR were applied to produce predictions under the three predictive propensities of over-prediction, under-prediction, and neutral prediction. When both parameters were used, GSVQR and ATSVR could each predict according to the intended propensity, and ATSVR was more than twice as fast in computation time. In terms of accuracy, there was no significant difference between ATSVR and GSVQR, although inspection of the figures showed that GSVQR reflected the predictive propensity better than ATSVR. The accuracy of under-prediction and over-prediction was lower than that of neutral prediction. Using both parameters (p_1, p_2) rather than only one of them appears to increase the change in predictive propensity; depending on the situation, however, using only one of the two parameters may be preferable.
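To make the role of the two parameters concrete, the sketch below shows one plausible way an asymmetric ε-insensitive loss can be parameterized in Python: p1 splits the ε-tube width between its upper and lower sides, and p2 weights the penalties for under- versus over-prediction. The function name, the (p1, p2) scaling, and the example values are illustrative assumptions, not the exact formulation of ATSVR or GSVQR in the paper.

```python
import numpy as np

def asymmetric_eps_loss(y_true, y_pred, eps=0.1, p1=0.5, p2=0.5):
    """Illustrative asymmetric epsilon-insensitive loss (not the paper's exact form).

    p1 splits the tube width between the upper and lower sides of the
    fitted function; p2 weights under- vs. over-prediction penalties.
    With p1 = p2 = 0.5 it reduces to the symmetric epsilon-insensitive loss.
    """
    r = np.asarray(y_true) - np.asarray(y_pred)   # residuals (positive = under-prediction)
    upper = 2.0 * p1 * eps                        # tube width above the fitted function
    lower = 2.0 * (1.0 - p1) * eps                # tube width below the fitted function
    under = np.maximum(r - upper, 0.0)            # under-prediction beyond the upper band
    over = np.maximum(-r - lower, 0.0)            # over-prediction beyond the lower band
    return float(np.mean(2.0 * p2 * under + 2.0 * (1.0 - p2) * over))

# Example: weighting under-prediction more heavily (p2 > 0.5) would push a model
# trained on this loss toward an over-prediction propensity.
y_true = np.array([1.0, 2.0, 3.0])
y_pred = np.array([0.8, 2.3, 2.9])
print(asymmetric_eps_loss(y_true, y_pred, eps=0.1, p1=0.5, p2=0.8))
```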
Keywords
Support Vector Regression; Machine Learning; Predictive Propensity
Citations & Related Records
Times Cited By KSCI: 2
References
1 Peng, X., TSVR: An efficient Twin Support Vector Machine for regression, Neural Networks, 2010, Vol. 23, pp. 365-372.
2 Lee, D.J., Tool Lifecycle Optimization using ν-Asymmetric Support Vector Regression, Journal of the Society of Korea Industrial and Systems Engineering, 2020, Vol. 43, No. 4, pp. 208-216.
3 Singh, M., Chadha, J., Ahuja, P., Jayadeva, and Chandra, S., Reduced Twin Support Vector Regression, Neurocomputing, 2011, Vol. 74, pp. 1474-1477.
4 Vapnik, V., Statistical Learning Theory, New York, NY: Wiley, 1998.
5 Wang, H. and Xu, Y., Scaling up twin support vector regression with safe screening rule, Information Sciences, 2018, Vol. 465, pp. 174-190.
6 Wu, D., Jennings, C., Terpenny, J., Gao, R.X., and Kumara, S., A Comparative Study on Machine Learning Algorithms for Smart Manufacturing: Tool Wear Prediction Using Random Forests, Journal of Manufacturing Science and Engineering, 2017, Vol. 139, No. 7, pp. 1-10.
7 Xu, Y., Li, X., Pan, X., and Yang, Z., Asymmetric ν-twin support vector regression, Neural Computing and Applications, 2018, Vol. 30, pp. 3799-3814.
8 Xu, Y. and Wang, L., A weighted twin support vector regression, Knowledge-Based Systems, 2012, Vol. 33, pp. 92-101.
9 Huang, X., Shi, L., Pelckmans, K., and Suykens, J., Asymmetric ε-tube support vector regression, Computational Statistics and Data Analysis, 2014, Vol. 77, pp. 371-382.
10 Lee, D.J. and Choi, S.J., Generalized Support Vector Quantile Regression, Journal of the Society of Korea Industrial and Systems Engineering, 2020, Vol. 43, No. 4, pp. 107-115.
11 Wang, S., Huang, X., and Yan, Y., A neural network of smooth hinge functions, IEEE Transactions on Neural Networks, 2010, Vol. 21, No. 9, pp. 1381-1395.
12 Wu, J., Wang, Y.-G., Tian, Y.-C., Burrage, K., and Cao, T., Support Vector Regression with Asymmetric Loss for Optimal Electric Load Forecasting, Energy, 2021, Vol. 223, 119969.