References
- Craven, P. and Wahba, G. (1979). Smoothing noisy data with spline functions: Estimating the correct degree of smoothing by the method of generalized cross-validation. Numerische Mathematik, 31, 377-390.
- Gunn, S. (1998). Support vector machines for classification and regression. ISIS Technical Report, University of Southampton.
- Mercer, J. (1909). Functions of positive and negative type and their connection with the theory of integral equations. Philosophical Transactions of the Royal Society, Series A, 209, 415-446. https://doi.org/10.1098/rsta.1909.0016
- Oh, K., Shim, J. and Kim, D. (2003). Incremental multi-classification by least squares support vector machine. Journal of Korean Data & Information Science Society, 14, 965-974.
- Perez-Cruz, F., Navia-Vazquez, A., Alarcon-Diana, P. L. and Artes-Rodriguez, A. (2000). An IRWLS procedure for SVR. In Proceedings of the European Signal Processing Conference, EUSIPCO 2000, Tampere, Finland.
- Platt, J. (1998). Sequential minimal optimization: A fast algorithm for training support vector machines. Microsoft Research Technical Report, MSR-TR-98-14.
- Seok, K. H. (2007). Semi-supervised learning using kernel estimation. Journal of Korean Data & Information Science Society, 18, 629-636.
- Smola, A. and Scholkopf, B. (1998). On a kernel-based method for pattern recognition, regression, approximation and operator inversion. Algorithmica, 22, 211-231. https://doi.org/10.1007/PL00013831
- Tipping, M. E. (2001). Sparse Bayesian learning and the relevance vector machine. Journal of Machine Learning Research, 1, 211-244. https://doi.org/10.1162/15324430152748236
- Vapnik, V. N. (1995). The nature of statistical learning theory, Springer, New York.
- Vapnik, V. N. (1998). Statistical learning theory, John Wiley, New York.
- Williams, P. M. (1995). Bayesian regularization and pruning using a Laplace prior. Neural Computation, 7, 117-143. https://doi.org/10.1162/neco.1995.7.1.117