One-Class Support Vector Learning and Linear Matrix Inequalities

  • Park, Jooyoung (Dept. of Control & Instrumentation Engineering, Korea University);
  • Kim, Jinsung (Dept. of Electrical Engineering, Korea University);
  • Lee, Hansung (Dept. of Computer and Information Science, Korea University);
  • Park, Daihee (Dept. of Computer and Information Science, Korea University)
  • Published: 2003.06.01

Abstract

The SVDD (support vector data description) is one of the best-known one-class support vector learning methods. It distinguishes a set of normal data from all other possible abnormal objects by enclosing the normal data in a ball defined on the kernel feature space. The main concern of this paper is to modify the SVDD to use ellipsoids instead of balls in order to achieve better classification performance. After a brief review of the original SVDD method, the paper establishes a new method that uses ellipsoids in the feature space and presents its solution as an SDP (semi-definite programming) problem, an optimization problem based on linear matrix inequalities.
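
For reference, the original SVDD reviewed in the paper admits the following standard primal formulation (a sketch in illustrative notation based on the well-known ball-based description; the paper's own notation may differ). Training data x_1, ..., x_N are mapped into the kernel feature space by φ, and one seeks the smallest ball of radius R and center a enclosing the normal data, softened by slack variables ξ_i and a trade-off constant C:

    % Standard ball-based SVDD primal (illustrative notation)
    \begin{align*}
    \min_{R,\,a,\,\xi}\quad & R^{2} + C\sum_{i=1}^{N}\xi_{i} \\
    \text{subject to}\quad  & \|\phi(x_{i}) - a\|^{2} \le R^{2} + \xi_{i}, \qquad i = 1,\dots,N, \\
                            & \xi_{i} \ge 0, \qquad i = 1,\dots,N.
    \end{align*}

The modification studied in this paper replaces the ball constraint with an ellipsoidal one in the feature space, which leads to an optimization problem that can be cast as an SDP with linear matrix inequality constraints.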

Keywords

Cited by

  1. Ship Detection Using Edge-Based Segmentation and Histogram of Oriented Gradient with Ship Size Ratio vol.15, pp.4, 2015, https://doi.org/10.5391/IJFIS.2015.15.4.251
  2. Some Observations for Portfolio Management Applications of Modern Machine Learning Methods vol.16, pp.1, 2016, https://doi.org/10.5391/IJFIS.2016.16.1.44