Support vector expectile regression using IRWLS procedure

  • Received : 2014.06.05
  • Accepted : 2014.07.01
  • Published : 2014.07.31

Abstract

In this paper we propose an iteratively reweighted least squares (IRWLS) procedure for solving the quadratic programming problem of support vector expectile regression with an asymmetrically weighted squared loss function. The proposed procedure also makes it easy to select appropriate hyperparameters via the generalized cross-validation (GCV) function. Numerical studies on artificial and real data sets demonstrate the estimation performance of the proposed method.
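The idea behind the abstract can be sketched in code: an IRWLS loop that alternates a weighted kernel ridge solve with an update of the asymmetric expectile weights, plus a GCV score for tuning. This is a minimal illustrative sketch, not the paper's exact formulation; the RBF kernel, the ridge-style penalty, and all function names and parameters are assumptions introduced here.

```python
import numpy as np

def rbf_kernel(X, Z, gamma=1.0):
    """Gaussian RBF kernel matrix between the rows of X and Z."""
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * d2)

def expectile_irwls(K, y, tau=0.5, lam=1.0, max_iter=100):
    """Kernel expectile regression by IRWLS (illustrative sketch).

    Minimizes sum_i w_i * (y_i - (K a)_i)^2 + lam * a' K a, where the
    asymmetric weights are w_i = tau if the residual is >= 0 and
    1 - tau otherwise.  Each step is a weighted ridge solve; the
    weights are then refreshed from the new residuals until they
    stop changing.
    """
    n = len(y)
    w = np.full(n, 0.5)                      # start from symmetric weights
    for _ in range(max_iter):
        # weighted ridge solution: a = (W K + lam I)^{-1} W y
        a = np.linalg.solve(w[:, None] * K + lam * np.eye(n), w * y)
        r = y - K @ a
        w_new = np.where(r >= 0, tau, 1.0 - tau)
        if np.array_equal(w_new, w):         # weights fixed -> converged
            break
        w = w_new
    return a, w

def gcv(K, y, a, w, lam):
    """GCV score for one hyperparameter setting (hypothetical form)."""
    n = len(y)
    # hat matrix H maps y to fitted values: H = K (W K + lam I)^{-1} W
    H = K @ np.linalg.solve(w[:, None] * K + lam * np.eye(n), np.diag(w))
    r = y - K @ a
    return (w * r ** 2).sum() / n / (1.0 - np.trace(H) / n) ** 2
```

In use, one would fit the model over a grid of `(lam, gamma)` values and keep the setting with the smallest GCV score; fitting at tau = 0.9 and tau = 0.1 on the same data should yield upper and lower expectile curves bracketing the conditional mean.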

Keywords
