http://dx.doi.org/10.5351/CKSS.2007.14.3.507

Multinomial Kernel Logistic Regression via Bound Optimization Approach  

Shim, Joo-Yong (Department of Applied Statistics, Catholic University of Daegu)
Hong, Dug-Hun (Department of Mathematics, Myongji University)
Kim, Dal-Ho (Department of Statistics, Kyungpook National University)
Hwang, Chang-Ha (Division of Information and Computer Science, Dankook University)
Publication Information
Communications for Statistical Applications and Methods / v.14, no.3, 2007, pp. 507-516
Abstract
Multinomial logistic regression is probably the most popular probabilistic discriminative classifier for multiclass classification problems. In this paper, a kernel variant of multinomial logistic regression is proposed by combining Newton's method with a bound optimization approach. This formulation allows us to apply highly efficient approximation methods that effectively overcome conceptual and numerical problems of standard multiclass kernel classifiers. We also provide an approximate cross validation (ACV) method for choosing the hyperparameters that affect the performance of the proposed approach. Experimental results are presented to illustrate the performance of the proposed procedure.
Keywords
Approximate cross validation; hyperparameters; multinomial logistic regression; support vector machine;
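The bound-optimization idea behind the abstract can be sketched in a few lines: Bohning-style bound optimization replaces the class-probability-dependent Hessian of the multinomial log-likelihood with a fixed dominating matrix, so every Newton-like step reuses one pre-factorized system. The sketch below is an illustrative simplification under stated assumptions, not the paper's exact algorithm: it relaxes the matrix bound to the scalar 1/2 (its largest eigenvalue) so the classes decouple, uses an RBF kernel, and the function names, toy data, and regularization value are all hypothetical choices for the demo.

```python
import numpy as np

def rbf_kernel(X, Z, gamma=0.5):
    """Gram matrix of the Gaussian (RBF) kernel."""
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def softmax(F):
    """Row-wise softmax with the usual max-shift for stability."""
    F = F - F.max(axis=1, keepdims=True)
    E = np.exp(F)
    return E / E.sum(axis=1, keepdims=True)

def fit_kernel_mlr(K, Y, lam=1e-2, iters=200):
    """Bound-optimization training of multinomial kernel logistic regression.

    Assumption: Bohning's matrix bound (1/2)(I - 11'/k) on the Hessian is
    relaxed to the scalar 1/2, so each class uses the same fixed surrogate
    system, inverted once and reused at every iteration.
    """
    n, k = Y.shape
    A = np.zeros((n, k))                  # dual coefficients, one column per class
    M = 0.5 * K + lam * np.eye(n)         # fixed surrogate system matrix
    Minv = np.linalg.inv(M)               # factor once, reuse every step
    for _ in range(iters):
        P = softmax(K @ A)                # current class probabilities
        A = A + Minv @ (Y - P - lam * A)  # surrogate (bounded) Newton step
    return A

# toy 3-class problem: three well-separated Gaussian blobs
rng = np.random.default_rng(0)
centers = np.array([[0.0, 0.0], [4.0, 0.0], [0.0, 4.0]])
X = np.vstack([c + 0.5 * rng.standard_normal((20, 2)) for c in centers])
y = np.repeat(np.arange(3), 20)
Y = np.eye(3)[y]                          # one-hot labels

K = rbf_kernel(X, X)
A = fit_kernel_mlr(K, Y)
acc = (softmax(K @ A).argmax(axis=1) == y).mean()
print(f"training accuracy: {acc:.2f}")
```

Because the surrogate matrix never changes, the per-iteration cost after the one-time factorization is a single matrix-vector-style update per class, which is the efficiency advantage the abstract alludes to relative to a full Newton step with a fresh Hessian each iteration.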