http://dx.doi.org/10.7465/jkdi.2013.24.2.391

GACV for partially linear support vector regression  

Shim, Jooyong (Department of Data Science, Inje University)
Seok, Kyungha (Department of Data Science, Inje University)
Publication Information
Journal of the Korean Data and Information Science Society, v.24, no.2, 2013, pp. 391-399
Abstract
Partially linear regression can provide a more complete description of the linear and nonlinear relationships among random variables. In support vector regression (SVR), the hyper-parameters are known to affect regression performance. In this paper we propose an iterative reweighted least squares (IRWLS) procedure to solve the quadratic programming problem of partially linear support vector regression with a modified loss function, which enables us to use the generalized approximate cross validation (GACV) function to select the hyper-parameters. Experimental results illustrating the performance of the partially linear SVR fitted with the IRWLS procedure are then presented.
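A minimal sketch of the fitting step is given below, under stated assumptions: the modified loss is taken to be the squared ε-insensitive loss, the kernel is Gaussian, and each IRWLS iteration solves a semiparametric LS-SVM-style KKT system for the kernel coefficients, intercept, and linear coefficients. The names plsvr_irwls and rbf_kernel and all parameter defaults are illustrative, not from the paper; the paper's GACV formula is not reproduced here, and in practice (C, ε, σ) would be chosen by minimizing GACV over a grid.

```python
import numpy as np

def rbf_kernel(V1, V2, sigma=1.0):
    # Gaussian kernel matrix between the rows of V1 and V2.
    d2 = ((V1[:, None, :] - V2[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def plsvr_irwls(y, U, V, C=10.0, eps=0.1, sigma=1.0, n_iter=30, tol=1e-6):
    """Partially linear SVR, y ~ U @ beta + f(V) + b, fitted by IRWLS.

    Assumes the squared eps-insensitive loss as the modified loss; each
    iteration solves a weighted least squares (LS-SVM-like) linear system.
    """
    n, p = U.shape
    K = rbf_kernel(V, V, sigma)
    a = np.full(n, C)                       # initial working weights
    alpha, b, beta = np.zeros(n), 0.0, np.zeros(p)
    for _ in range(n_iter):
        # KKT system: [K + diag(1/a), 1, U; 1', 0, 0; U', 0, 0].
        A = np.zeros((n + 1 + p, n + 1 + p))
        A[:n, :n] = K + np.diag(1.0 / a)
        A[:n, n] = 1.0
        A[n, :n] = 1.0
        A[:n, n + 1:] = U
        A[n + 1:, :n] = U.T
        sol = np.linalg.solve(A, np.concatenate([y, np.zeros(1 + p)]))
        alpha_new, b, beta = sol[:n], sol[n], sol[n + 1:]
        converged = np.max(np.abs(alpha_new - alpha)) < tol
        alpha = alpha_new
        if converged:
            break
        # Reweight from the residuals: points inside the eps-tube get
        # (near) zero weight, as in IRWLS for an eps-insensitive loss.
        e = y - (K @ alpha + b + U @ beta)
        abs_e = np.abs(e)
        a = np.where(abs_e > eps,
                     2.0 * C * (abs_e - eps) / np.maximum(abs_e, 1e-12),
                     1e-6)                  # small floor keeps diag(1/a) finite
    return alpha, b, beta

# Toy usage on simulated data: y = 2 u + sin(v) + noise.
rng = np.random.default_rng(0)
U = rng.normal(size=(100, 1))
V = rng.uniform(-3.0, 3.0, size=(100, 1))
y = 2.0 * U[:, 0] + np.sin(V[:, 0]) + 0.1 * rng.normal(size=100)
alpha, b, beta = plsvr_irwls(y, U, V)       # beta[0] should be near 2
```

The diagonal floor on the working weights is a pragmatic substitute for handling the zero-weight (inside-tube) points exactly; it keeps every iteration a single dense linear solve while still driving the corresponding coefficients toward zero.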
Keywords
Generalized approximate cross validation function; iterative reweighted least squares procedure; partially linear regression; support vector regression
Citations & Related Records
Times Cited By KSCI: 3
1 Cho, D. H., Shim, J. and Seok, K. H. (2010). Doubly penalized kernel method for heteroscedastic autoregressive data. Journal of the Korean Data & Information Science Society, 21, 155-162.
2 Hwang, H. (2010). Fixed size LS-SVM for multiclassification problems of large datasets. Journal of the Korean Data & Information Science Society, 21, 561-567.
3 Kuhn, H. W. and Tucker, A. W. (1951). Nonlinear programming. Proceedings of the Second Berkeley Symposium on Mathematical Statistics and Probability, University of California Press, Berkeley, 481-492.
4 Mercer, J. (1909). Functions of positive and negative type and their connection with the theory of integral equations. Philosophical Transactions of the Royal Society of London A, 209, 415-446.
5 Nychka, D., Gray, G., Haaland, P., Martin, D. and O'Connell, M. (1995). A nonparametric regression approach to syringe grading for quality improvement. Journal of the American Statistical Association, 90, 1171-1178.
6 Perez-Cruz, F., Navia-Vazquez, A., Alarcon-Diana, P. L. and Artes-Rodriguez, A. (2000). An IRWLS procedure for SVR. In Proceedings of the European Signal Processing Conference, EUSIPCO 2000, Tampere, Finland.
7 Platt, J. (1998). Sequential minimal optimization: A fast algorithm for training support vector machines, Technical Report MSR-TR-98-14, Microsoft Research, Redmond.
8 Shim, J., Kim, C. and Hwang, C. (2011). Semiparametric least squares support vector machine for accelerated failure time model. Journal of the Korean Statistical Society, 40, 75-83.
9 Smola, A. J. and Scholkopf, B. (1998). On a kernel-based method for pattern recognition, regression, approximation and operator inversion. Algorithmica, 22, 211-231.
10 Vapnik, V. N. (1995). The nature of statistical learning theory, Springer, New York.
11 Vapnik, V. N. (1998). Statistical learning theory, John Wiley, New York.
12 Wahba, G., Lin, Y. and Zhang, H. (1999). Generalized approximate cross validation for support vector machines, or another way to look at margin-like quantities, Technical Report 1006, University of Wisconsin, Madison.
13 Wang, L. (Ed.) (2005). Support vector machines: Theory and applications, Springer, New York.
14 Yuan, M. (2006). GACV for quantile smoothing splines. Computational Statistics and Data Analysis, 50, 813-829.