Nu-SVR Learning with Predetermined Basis Functions Included

  • Young-Il Kim (Department of Control and Instrumentation Engineering, Korea University) ;
  • Won-Hee Cho (Department of Control and Instrumentation Engineering, Korea University) ;
  • Joo-Young Park (Department of Control and Instrumentation Engineering, Korea University)
  • Published: 2003.06.01

Abstract

Recently, support vector learning has attracted great interest in the areas of pattern classification, function approximation, and abnormality detection. It is well known that among the various support vector learning methods, the so-called $\nu$-versions are particularly useful when the total number of support vectors needs to be controlled. In this paper, we consider the problem of function approximation utilizing both predetermined basis functions and a $\nu$-version support vector learning method called $\nu$-SVR. After reviewing $\varepsilon$-SVR, $\nu$-SVR, and a semi-parametric approach, this paper presents an extension of the conventional $\nu$-SVR method in a direction that can utilize predetermined basis functions. Moreover, the applicability of the presented method is illustrated via an example.
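The semi-parametric model form the abstract refers to writes the regressor as a kernel expansion plus a predetermined parametric part, $f(x) = \sum_i \alpha_i k(x_i, x) + \sum_j \beta_j \phi_j(x) + b$. The sketch below illustrates that decomposition with a simplified two-stage fit (least squares for the predetermined basis, then scikit-learn's `NuSVR` on the residual); note this is a hypothetical illustration of the model structure, not the paper's joint $\nu$-SVR optimization, and the data, basis choice, and hyperparameters are assumptions.

```python
# Illustrative semi-parametric regression:
#   f(x) = (predetermined basis part) + (nu-SVR kernel part)
# Two-stage approximation, NOT the paper's single joint optimization.
import numpy as np
from sklearn.svm import NuSVR

rng = np.random.default_rng(0)
x = np.linspace(-3, 3, 200)
# Synthetic target: linear trend + nonlinear bump + small noise (assumed example data).
y = 0.5 * x + np.sinc(x) + 0.05 * rng.standard_normal(x.size)

# Stage 1: least-squares fit of the predetermined basis, here phi(x) = [x, 1].
Phi = np.column_stack([x, np.ones_like(x)])
beta, *_ = np.linalg.lstsq(Phi, y, rcond=None)
residual = y - Phi @ beta

# Stage 2: nu-SVR on the residual; nu upper-bounds the fraction of margin
# errors and lower-bounds the fraction of support vectors.
svr = NuSVR(nu=0.3, C=10.0, kernel="rbf", gamma=1.0)
svr.fit(x.reshape(-1, 1), residual)

# Combined semi-parametric predictor.
f_hat = Phi @ beta + svr.predict(x.reshape(-1, 1))
rmse = float(np.sqrt(np.mean((f_hat - y) ** 2)))
print(f"RMSE: {rmse:.4f}, support vectors: {svr.support_.size} of {x.size}")
```

The appeal of including the predetermined basis is visible here: the RBF kernel part only has to model the residual nonlinearity, while known structure (the linear trend) is captured exactly by the parametric term.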
