• Title/Summary/Keyword: support vector


A Note on Support Vector Density Estimation with Wavelets

  • Lee, Sung-Ho
    • Journal of the Korean Data and Information Science Society, v.16 no.2, pp.411-418, 2005
  • We review support vector density estimation and wavelet density estimation. The relationship between the two in reproducing kernel Hilbert space (RKHS) is investigated in order to use wavelets as a class of support vector kernels in support vector density estimation.


Prediction Performance of Hybrid Least Square Support Vector Machine with First Principle Knowledge (First Principle을 결합한 최소제곱 Support Vector Machine의 예측 능력)

  • 김병주;심주용;황창하;김일곤
    • Journal of KIISE:Software and Applications, v.30 no.7_8, pp.744-751, 2003
  • A hybrid least squares Support Vector Machine combined with First Principle (FP) knowledge is proposed. We compare the hybrid least squares Support Vector Machine (HLS-SVM) with previously proposed models such as the Hybrid Neural Network (HNN) and HNN with Extended Kalman Filter (HNN-EKF). In the training and validation stages, HLS-SVM shows performance similar to HNN-EKF and better than HNN, whereas in the testing stage it performs about three times better than HNN-EKF and a hundred times better than HNN.
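The least squares variant mentioned above replaces the SVM's quadratic program with a linear system: with equality constraints and squared error, the dual reduces to solving (K + I/gamma) alpha = y. A minimal sketch of that idea, with the bias term omitted for brevity and a toy data set; this is an illustrative assumption, not the paper's hybrid FP model:

```python
import math

def solve_linear(A, b):
    """Gaussian elimination with partial pivoting (enough for the small demo system)."""
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def lssvm_fit(xs, ys, gamma=10.0, width=1.0):
    """Least squares SVM (no bias term): solve (K + I/gamma) alpha = y
    instead of a quadratic program."""
    n = len(xs)
    K = [[math.exp(-((xs[i] - xs[j]) ** 2) / width) for j in range(n)] for i in range(n)]
    A = [[K[i][j] + (1.0 / gamma if i == j else 0.0) for j in range(n)] for i in range(n)]
    return solve_linear(A, ys)

def lssvm_predict(xs, alpha, x, width=1.0):
    # Prediction is a kernel expansion over ALL training points
    # (LS-SVM solutions are generally not sparse).
    return sum(a * math.exp(-((x - xi) ** 2) / width) for a, xi in zip(alpha, xs))

xs = [0.0, 1.0, 2.0, 3.0]        # hypothetical inputs
ys = [0.0, 0.8, 0.9, 0.1]        # hypothetical targets
alpha = lssvm_fit(xs, ys)
```

The ridge term I/gamma makes the system well-posed, at the cost of losing the sparsity of the standard SVM dual.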

A Differential Evolution based Support Vector Clustering (차분진화 기반의 Support Vector Clustering)

  • Jun, Sung-Hae
    • Journal of the Korean Institute of Intelligent Systems, v.17 no.5, pp.679-683, 2007
  • Statistical learning theory by Vapnik consists of the support vector machine (SVM), support vector regression (SVR), and support vector clustering (SVC) for classification, regression, and clustering, respectively. Among these algorithms, SVC is a good clustering algorithm that uses support vectors based on a Gaussian kernel function. However, like SVM and SVR, SVC needs its kernel parameter and regularization constant to be determined optimally. In general, these parameters have been determined either by researchers' expertise or by grid search, which demands heavy computing time. In this paper, we propose a differential evolution based SVC (DESVC), which combines differential evolution with SVC for efficient selection of the kernel parameter and regularization constant. To verify the improved performance of DESVC, we perform experiments using data sets from the UCI machine learning repository and simulation.
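The search strategy described above can be sketched with a plain DE/rand/1/bin loop. The objective below is a stand-in quadratic, not the clustering-quality score DESVC would actually optimize, and the bounds and control constants are illustrative assumptions:

```python
import random

def differential_evolution(objective, bounds, pop_size=20, F=0.8, CR=0.9,
                           generations=60, seed=0):
    """Minimal DE/rand/1/bin minimizer over box-constrained parameters."""
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    scores = [objective(x) for x in pop]
    for _ in range(generations):
        for i in range(pop_size):
            # mutation: combine three distinct members other than i
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            mutant = [pop[a][d] + F * (pop[b][d] - pop[c][d]) for d in range(dim)]
            # binomial crossover, clipped to the search bounds
            trial = [
                min(max(mutant[d], bounds[d][0]), bounds[d][1])
                if rng.random() < CR else pop[i][d]
                for d in range(dim)
            ]
            s = objective(trial)
            if s <= scores[i]:           # greedy selection
                pop[i], scores[i] = trial, s
    best = min(range(pop_size), key=scores.__getitem__)
    return pop[best], scores[best]

# Stand-in objective: in DESVC this would be a clustering-quality score of
# SVC run with kernel width q and regularization constant C (hypothetical).
def surrogate(params):
    q, C = params
    return (q - 1.5) ** 2 + (C - 10.0) ** 2

best, score = differential_evolution(surrogate, bounds=[(0.1, 5.0), (0.1, 100.0)])
```

Because each candidate requires only one objective evaluation per generation, the population explores the (kernel width, C) box far more cheaply than an exhaustive grid of the same resolution.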

Fuzzy One Class Support Vector Machine (퍼지 원 클래스 서포트 벡터 머신)

  • Kim, Ki-Joo;Choi, Young-Sik
    • Journal of Internet Computing and Services, v.6 no.3, pp.159-170, 2005
  • OC-SVM (One-Class Support Vector Machine) avoids solving a full density estimation problem and instead focuses on a simpler task: estimating quantiles of a data distribution, i.e. its support. OC-SVM seeks to estimate the regions where most of the data resides and represents those regions as a function of the support vectors. Although OC-SVM is a powerful method for data description, it is difficult to incorporate human subjective importance into its estimation process. In order to integrate the importance of each point into the OC-SVM process, we propose a fuzzy version of OC-SVM. In FOC-SVM (Fuzzy One-Class Support Vector Machine), we do not treat all data points equally; instead, we weight data points according to the importance measure of the corresponding objects. That is, we scale the kernel feature vector according to the importance measure of the object, so that the kernel feature vector of a less important object contributes less to the detection process of OC-SVM. We demonstrate the performance of our algorithm on several synthesized data sets, and the experimental results are promising.
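Scaling the kernel feature vector by a membership s_i amounts to multiplying kernel entries, since the inner product of s_i*phi(x_i) and s_j*phi(x_j) is s_i*s_j*K(x_i, x_j). A minimal sketch of that weighting with a Gaussian kernel; the points and membership values are made up for illustration:

```python
import math

def gaussian_kernel(x, y, gamma=0.5):
    """Gaussian (RBF) kernel between two points."""
    return math.exp(-gamma * sum((a - b) ** 2 for a, b in zip(x, y)))

def weighted_kernel_matrix(points, weights, gamma=0.5):
    """Scale each feature vector phi(x_i) by its membership s_i, so the
    effective kernel entry becomes s_i * s_j * K(x_i, x_j)."""
    n = len(points)
    return [[weights[i] * weights[j] * gaussian_kernel(points[i], points[j], gamma)
             for j in range(n)] for i in range(n)]

points = [(0.0, 0.0), (1.0, 0.0), (5.0, 5.0)]   # last point: suspected outlier
weights = [1.0, 1.0, 0.2]                        # low membership downweights it
K = weighted_kernel_matrix(points, weights)
```

Feeding this weighted matrix into the usual OC-SVM dual shrinks the self-similarity K[i][i] of low-membership points, so they exert less pull on the estimated support boundary.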


WHEN CAN SUPPORT VECTOR MACHINE ACHIEVE FAST RATES OF CONVERGENCE?

  • Park, Chang-Yi
    • Journal of the Korean Statistical Society, v.36 no.3, pp.367-372, 2007
  • Classification as a tool to extract information from data plays an important role in science and engineering. Among various classification methodologies, support vector machine has recently seen significant developments. The central problem this paper addresses is the accuracy of support vector machine. In particular, we are interested in the situations where fast rates of convergence to the Bayes risk can be achieved by support vector machine. Through learning examples, we illustrate that support vector machine may yield fast rates if the space spanned by an adopted kernel is sufficiently large.

A Study on the Pattern Recognition Using Support Vector Fuzzy Inference System (Support Vector Fuzzy Inference System을 이용한 Pattern Recognition 에 관한 연구)

  • 김용균;정은화
    • Proceedings of the Korea Multimedia Society Conference, 2003.05b, pp.374-379, 2003
  • In this paper, we propose a support vector fuzzy inference system (SVFIS) for pattern recognition. A support vector machine is used to identify the structure and parameters of the fuzzy inference system, and the gradient descent method is used for error minimization. To evaluate the performance of the proposed SVFIS, we perform three-dimensional object recognition experiments using COIL images.


Development of Fuzzy Twin Support Vector Machine and Evaluation of Performance Using Ionosphere Radar Data (Fuzzy Twin Support Vector Machine 개발 및 전리층 레이더 데이터를 통한 성능 평가)

  • Cheon, Min-Kyu;Yoon, Chang-Yong;Kim, Eun-Tai;Park, Mig-Non
    • Journal of the Korean Institute of Intelligent Systems, v.18 no.4, pp.549-554, 2008
  • The support vector machine (SVM) is a classifier based on statistical learning theory. The Twin Support Vector Machine (TWSVM) is a kind of binary classifier that determines two nonparallel planes by solving two related SVM-type problems. The training time of TWSVM is shorter than that of SVM, yet TWSVM does not show worse performance than SVM. This paper proposes a TWSVM to which fuzzy membership is applied, and compares the performance of this classifier with other classifiers using the Ionosphere radar data set.
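The TWSVM decision rule assigns a point to the class whose plane lies nearer. A sketch of that rule alone, with hypothetical planes standing in for the solutions of the two SVM-type problems:

```python
import math

def plane_distance(plane, x):
    """Perpendicular distance from x to the hyperplane w.x + b = 0."""
    w, b = plane
    norm = math.sqrt(sum(wi * wi for wi in w))
    return abs(sum(wi * xi for wi, xi in zip(w, x)) + b) / norm

def twsvm_predict(plane_pos, plane_neg, x):
    """TWSVM decision rule: assign x to the class whose plane is nearer."""
    return 1 if plane_distance(plane_pos, x) <= plane_distance(plane_neg, x) else -1

# Hypothetical nonparallel planes, one fitted to each class.
plane_pos = ([1.0, 0.0], 0.0)    # plane x1 = 0, fitted to the +1 class
plane_neg = ([0.0, 1.0], -2.0)   # plane x2 = 2, fitted to the -1 class
label = twsvm_predict(plane_pos, plane_neg, (0.5, 5.0))
```

Each plane is fitted to lie close to its own class and far from the other, which is why the smaller of the two problems can be solved quickly relative to a single full-size SVM.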

Design of SVM-Based Polynomial Neural Networks Classifier Using Particle Swarm Optimization (입자군집 최적화를 이용한 SVM 기반 다항식 뉴럴 네트워크 분류기 설계)

  • Roh, Seok-Beom;Oh, Sung-Kwun
    • The Transactions of The Korean Institute of Electrical Engineers, v.67 no.8, pp.1071-1079, 2018
  • In this study, we introduce the design methodology and network architecture of the Support Vector Machine based Polynomial Neural Network, a kind of dynamically generated neural network. It is a novel architecture obtained by redesigning polynomial neural networks with the aid of Support Vector Machines. Generic polynomial neural networks, whose nodes are polynomials, are generated dynamically layer by layer. In the proposed network, each node is constructed as a support vector machine, and the nodes and layers are generated dynamically, following the generation process of generic polynomial neural networks. The support vector machine is well known as a robust pattern classifier. In addition, to enhance both the structural flexibility and the classification performance of the proposed classifier, multi-objective particle swarm optimization is used; the optimization algorithm drives the successive generation of each layer. Benchmark data sets are used to demonstrate the pattern classification performance of the proposed classifier by comparing its generalization ability with that of previously studied classifiers.
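A plain single-objective PSO loop illustrates the velocity and position updates such an optimizer relies on; the paper's multi-objective variant and its SVM-accuracy versus network-complexity objectives are not reproduced here, and the quadratic below is a stand-in:

```python
import random

def pso_minimize(objective, bounds, n_particles=15, iters=80,
                 w=0.7, c1=1.5, c2=1.5, seed=1):
    """Plain particle swarm optimizer: inertia + cognitive + social terms."""
    rng = random.Random(seed)
    dim = len(bounds)
    pos = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                    # personal bests
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=pbest_val.__getitem__)
    gbest, gbest_val = pbest[g][:], pbest_val[g]   # global best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] = min(max(pos[i][d] + vel[i][d],
                                    bounds[d][0]), bounds[d][1])
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Hypothetical surrogate: in the paper this would trade off SVM-node
# accuracy against network complexity.
best, val = pso_minimize(lambda p: (p[0] - 2.0) ** 2 + (p[1] + 1.0) ** 2,
                         bounds=[(-5.0, 5.0), (-5.0, 5.0)])
```

In a layer-wise design, one such search would run per layer, selecting the node configuration before the next layer is generated.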

Development of Fuzzy Twin Support Vector Machine for Pattern Classification (패턴 분류를 위한 Fuzzy Twin Support Vector machine 개발)

  • Cheon, Min-Gyu;Yun, Chang-Yong;Kim, Eun-Tae;Park, Min-Yong
    • Proceedings of the Korean Institute of Intelligent Systems Conference, 2007.11a, pp.279-282, 2007
  • The Support Vector Machine (SVM) is a classifier based on statistical learning theory. The Twin Support Vector Machine (TWSVM) is a type of binary SVM classifier that determines two nonparallel planes by solving two related SVM-type problems and builds the classifier from these two planes. With this approach, the training time of TWSVM is much shorter than that of SVM, while its performance is not inferior to that of SVM. This paper proposes a TWSVM that applies fuzzy membership to the classifier input, and compares it with the previously proposed TWSVM through experiments on two-dimensional vector inputs.


A note on SVM estimators in RKHS for the deconvolution problem

  • Lee, Sungho
    • Communications for Statistical Applications and Methods, v.23 no.1, pp.71-83, 2016
  • In this paper we discuss a deconvolution density estimator obtained using the support vector machine (SVM) and Tikhonov's regularization method for solving ill-posed problems in reproducing kernel Hilbert space (RKHS). A remarkable property of the SVM is that it leads to sparse solutions; however, the support vector deconvolution density estimator does not preserve sparsity as well as expected. Thus, in Section 3, we propose another support vector deconvolution estimator (method II) which leads to a very sparse solution. The performance of the deconvolution density estimators based on the support vector method is compared with the classical kernel deconvolution density estimator for the important cases of Gaussian and Laplacian measurement error by means of a simulation study. In the case of Gaussian error, the proposed support vector deconvolution estimator shows the same performance as the classical kernel deconvolution density estimator.