1. Vapnik, V. (1998). Statistical learning theory, Wiley, New York.
2. Shim, J., Park, H. J. and Seok, K. H. (2009). Variance function estimation with LS-SVM for replicated data. Journal of Korean Data & Information Science Society, 20, 925-931.
3. Silverman, B. (1986). Density estimation for statistics and data analysis, Chapman and Hall, New York.
4. Smola, A. J. and Scholkopf, B. (2004). A tutorial on support vector regression. Statistics and Computing, 14, 199-222.
5. Suykens, J. A. K. and Vandewalle, J. (1999). Least squares support vector machine classifiers. Neural Processing Letters, 9, 293-300.
6. Suykens, J. A. K., Van Gestel, T., De Brabanter, J., De Moor, B. and Vandewalle, J. (2002). Least squares support vector machines, World Scientific, Singapore.
7. Tax, D. and Duin, R. (1999). Support vector domain description. Pattern Recognition Letters, 20, 1191-1199.
8. Vapnik, V. (1995). The nature of statistical learning theory, Springer, New York.
9. Parzen, E. (1962). On the estimation of a probability density function and the mode. Annals of Mathematical Statistics, 33, 1065-1076.
10. Hwang, H. (2010). Fixed size LS-SVM for multiclassification problems of large data sets. Journal of Korean Data & Information Science Society, 21, 561-567.
11. Scholkopf, B. and Smola, A. (2002). Learning with kernels: Support vector machines, regularization, optimization, and beyond, MIT Press, Cambridge, MA.
12. Seok, K. H. (2010). Semi-supervised classification with LS-SVM formulation. Journal of Korean Data & Information Science Society, 21, 461-470.
13. Shim, J. and Lee, J. T. (2009). Kernel method for autoregressive data. Journal of Korean Data & Information Science Society, 20, 949-964.
14. Herbrich, R. (2002). Learning kernel classifiers: Theory and algorithms, MIT Press, Cambridge, MA.
15. Duda, R. O. and Hart, P. E. (1973). Pattern classification and scene analysis, Wiley, New York.