
WHEN CAN SUPPORT VECTOR MACHINE ACHIEVE FAST RATES OF CONVERGENCE?  

Park, Chang-Yi (Institute of Statistics, Korea University)
Publication Information
Journal of the Korean Statistical Society, v.36, no.3, 2007, pp. 367-372
Abstract
Classification, as a tool for extracting information from data, plays an important role in science and engineering. Among various classification methodologies, the support vector machine has recently seen significant development. The central problem this paper addresses is the accuracy of the support vector machine; in particular, we are interested in the situations where it can achieve fast rates of convergence to the Bayes risk. Using examples of learning problems, we illustrate that the support vector machine may yield fast rates when the function space spanned by the adopted kernel is sufficiently large.
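For readers who want the terminology pinned down, the following display is a minimal sketch, in standard statistical-learning notation (the symbols are illustrative and not taken from the paper), of the hinge loss used by the support vector machine, the misclassification and Bayes risks, and what a "fast rate" of convergence means.

% Hinge loss of a real-valued decision function f at an example (x, y), y \in \{-1, +1\};
% the induced classifier predicts \operatorname{sign} f(x).
\ell\bigl(y, f(x)\bigr) = \max\{0,\, 1 - y f(x)\}

% Misclassification risk of f and the Bayes risk (the best risk over all measurable rules).
R(f) = \Pr\bigl(Y \neq \operatorname{sign} f(X)\bigr), \qquad R^{*} = \inf_{f} R(f)

% A "fast rate" means the excess risk of an estimate \hat{f}_n built from n observations
% decays faster than the usual n^{-1/2}, for instance
\mathbb{E}\, R(\hat{f}_n) - R^{*} = O\bigl(n^{-\alpha}\bigr) \quad \text{for some } \alpha > 1/2.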
Keywords
Classification; empirical process; hinge loss; statistical learning theory