http://dx.doi.org/10.5351/CKSS.2009.16.6.997

An Algorithm for Support Vector Machines with a Reject Option Using Bundle Method  

Choi, Ho-Sik (Department of Informational Statistics and Institute of Basic Science, Hoseo University)
Kim, Yong-Dai (Department of Statistics, Seoul National University)
Han, Sang-Tae (Department of Informational Statistics and Institute of Basic Science, Hoseo University)
Kang, Hyun-Cheol (Department of Informational Statistics and Institute of Basic Science, Hoseo University)
Publication Information
Communications for Statistical Applications and Methods / v.16, no.6, 2009, pp. 997-1004
Abstract
A standard approach is to classify all future observations. In some cases, however, it is desirable to defer a decision, particularly for observations that are hard to classify; that is, it may be better to perform further, more advanced tests rather than to make a decision right away. This motivates a classifier with a reject option, which reports a warning for observations that are hard to classify. In this paper, we present a method that provides efficient computation for support vector machines with a reject option. Numerical results show the strong potential of the proposed method.
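
As a purely illustrative sketch (not the bundle-method optimization developed in the paper), the decision rule produced by a reject-option classifier can be written in a few lines: predict the sign of the score f(x) unless |f(x)| falls inside a rejection band around zero, in which case the observation is deferred. The weights w, intercept b, and band width delta below are hypothetical placeholders, not values from the paper.

    # Illustrative reject-option decision rule (hypothetical parameters, not the
    # paper's bundle-method training algorithm): predict sign(f(x)) for a linear
    # score f(x) = w.x + b, and abstain when |f(x)| < delta.
    import numpy as np

    def predict_with_reject(X, w, b, delta):
        """Return +1/-1 predictions, or 0 for rejected (deferred) observations."""
        scores = X @ w + b
        labels = np.sign(scores)
        labels[np.abs(scores) < delta] = 0  # too close to the boundary: defer
        return labels

    # Toy usage with made-up parameters.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(5, 2))
    w = np.array([1.0, -0.5])  # hypothetical trained weights
    b = 0.1                    # hypothetical intercept
    print(predict_with_reject(X, w, b, delta=0.3))

Observations whose score lands inside the band are the "hard to classify" cases the abstract refers to; in practice the band width trades off rejection rate against classification error.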
Keywords
Classification; reject option; support vector machines; bundle method