• Title/Summary/Keyword: SVM (Support Vector Method)

A note on SVM estimators in RKHS for the deconvolution problem

  • Lee, Sungho
    • Communications for Statistical Applications and Methods / v.23 no.1 / pp.71-83 / 2016
  • In this paper we discuss a deconvolution density estimator obtained using the support vector machine (SVM) and Tikhonov's regularization method for solving ill-posed problems in a reproducing kernel Hilbert space (RKHS). A remarkable property of the SVM is that it leads to sparse solutions, but the support vector deconvolution density estimator does not preserve sparsity as well as expected. Thus, in Section 3, we propose another support vector deconvolution estimator (method II) that leads to a very sparse solution. The performance of the deconvolution density estimators based on the support vector method is compared with that of the classical kernel deconvolution density estimator for the important cases of Gaussian and Laplacian measurement error by means of a simulation study. In the case of Gaussian error, the proposed support vector deconvolution estimator performs as well as the classical kernel deconvolution density estimator.
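
As background for the RKHS/Tikhonov machinery this abstract relies on, the numpy sketch below shows the generic Tikhonov-regularized estimate obtained via the representer theorem (i.e. kernel ridge form). The Gaussian kernel, bandwidth h, and penalty lam are illustrative assumptions; the paper's actual deconvolution estimator additionally accounts for the measurement-error density, which is not reproduced here.

```python
import numpy as np

def gaussian_kernel(X, Y, h=1.0):
    # Gram matrix of the Gaussian (RBF) kernel with bandwidth h
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * h ** 2))

def tikhonov_rkhs_fit(X, y, h=1.0, lam=1e-2):
    # Representer-theorem solution of
    #   min_f (1/n) * sum_i (y_i - f(x_i))^2 + lam * ||f||_H^2
    # which reduces to solving (K + n*lam*I) alpha = y
    n = len(X)
    K = gaussian_kernel(X, X, h)
    return np.linalg.solve(K + n * lam * np.eye(n), y)

def tikhonov_rkhs_predict(X_train, alpha, X_new, h=1.0):
    # f(x) = sum_i alpha_i * K(x_i, x)
    return gaussian_kernel(X_new, X_train, h) @ alpha
```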

Fuzzy One Class Support Vector Machine (퍼지 원 클래스 서포트 벡터 머신)

  • Kim, Ki-Joo;Choi, Young-Sik
    • Journal of Internet Computing and Services / v.6 no.3 / pp.159-170 / 2005
  • OC-SVM (One-Class Support Vector Machine) avoids solving a full density estimation problem and instead focuses on a simpler task: estimating quantiles of a data distribution, i.e., its support. OC-SVM seeks to estimate the regions where most of the data resides and represents these regions as a function of the support vectors. Although OC-SVM is a powerful method for data description, it is difficult to incorporate human subjective importance into its estimation process. In order to integrate the importance of each point into the OC-SVM process, we propose a fuzzy version of OC-SVM. In FOC-SVM (Fuzzy One-Class Support Vector Machine), we do not treat data points equally; instead, we weight data points according to the importance measure of the corresponding objects. That is, we scale the kernel feature vector according to the importance measure of the object, so that the kernel feature vector of a less important object contributes less to the detection process of OC-SVM. We demonstrate the performance of our algorithm on several synthesized data sets, and the experimental results are promising.
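
A minimal sketch of one-class support estimation with scikit-learn's OneClassSVM follows. The per-sample weights used in the second fit are only a rough stand-in for the paper's FOC-SVM, which scales each kernel feature vector by its fuzzy membership rather than weighting the loss; the importance measure below is an arbitrary example.

```python
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))                      # synthetic data cloud

# Plain OC-SVM: estimate a high-density region (the support) of the data
ocsvm = OneClassSVM(kernel="rbf", gamma=0.5, nu=0.1).fit(X)

# Rough fuzzy-style variant: downweight points deemed less important.
# NOTE: sample weights are a stand-in; FOC-SVM rescales the kernel
# feature vectors by the membership values instead.
importance = 1.0 / (1.0 + np.linalg.norm(X - X.mean(axis=0), axis=1))
weighted = OneClassSVM(kernel="rbf", gamma=0.5, nu=0.1)
weighted.fit(X, sample_weight=importance)

print(ocsvm.predict(X[:5]), weighted.predict(X[:5]))   # +1 inlier, -1 outlier
```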

A transductive least squares support vector machine with the difference convex algorithm

  • Shim, Jooyong;Seok, Kyungha
    • Journal of the Korean Data and Information Science Society / v.25 no.2 / pp.455-464 / 2014
  • Unlabeled examples are easier and less expensive to obtain than labeled examples. Semi-supervised approaches are used to utilize such examples in an effort to boost the predictive performance. This paper proposes a novel semi-supervised classification method named the transductive least squares support vector machine (TLS-SVM), which is based on the least squares support vector machine. The proposed method utilizes the difference convex algorithm to derive non-convex minimization solutions for the TLS-SVM. A generalized cross-validation method is also developed to choose the hyperparameters that affect the performance of the TLS-SVM. The experimental results confirm the successful performance of the proposed TLS-SVM.
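
To make the least squares base of the TLS-SVM concrete, here is a small numpy sketch of a standard, fully supervised LS-SVM classifier, whose training reduces to a single linear system. The transductive handling of unlabeled points and the difference convex iterations from the paper are not shown; the RBF kernel and regularization constant are illustrative choices.

```python
import numpy as np

def rbf(X, Y, gamma=1.0):
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def lssvm_fit(X, y, gamma_rbf=1.0, C=10.0):
    # Suykens-style LS-SVM classifier: equality constraints turn the QP into
    # one linear system  [[0, y^T], [y, Omega + I/C]] [b; alpha] = [0; 1]
    n = len(y)
    Omega = np.outer(y, y) * rbf(X, X, gamma_rbf)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = y
    A[1:, 0] = y
    A[1:, 1:] = Omega + np.eye(n) / C
    rhs = np.concatenate(([0.0], np.ones(n)))
    sol = np.linalg.solve(A, rhs)
    return sol[0], sol[1:]                     # bias b, coefficients alpha

def lssvm_predict(X_train, y_train, b, alpha, X_new, gamma_rbf=1.0):
    # decision rule: sign( sum_i alpha_i * y_i * K(x_i, x) + b )
    K = rbf(X_new, X_train, gamma_rbf)
    return np.sign(K @ (alpha * y_train) + b)
```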

Performance Analysis of Kernel Function for Support Vector Machine (Support Vector Machine에 대한 커널 함수의 성능 분석)

  • Sim, Woo-Sung;Sung, Se-Young;Cheng, Cha-Keon
    • Proceedings of the IEEK Conference / 2009.05a / pp.405-407 / 2009
  • SVM (Support Vector Machine) is a classification method that has recently attracted attention in machine learning. Vapnik, Osuna, Platt and others proposed methods for solving the QP (Quadratic Programming) problem required to train an SVM, thereby extending its field of application. An SVM finds a hyperplane that separates two classes by mapping input-space vectors to feature-space vectors using a kernel function. This approach is more systematic and theoretically grounded than neural networks, which rely on empirical learning. Although the SVM has superior generalization ability, its performance depends on the kernel function. Commonly used kernel functions fall into three categories: the polynomial kernel, the RBF (Radial Basis Function) kernel, and the sigmoid kernel. This paper analyzes the performance of the SVM with each kernel using synthetic data.
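
A minimal scikit-learn sketch of the kind of comparison the abstract describes: the same SVM evaluated with the polynomial, RBF, and sigmoid kernels on synthetic data. The data generator and hyperparameters here are assumptions, not the paper's setup.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

# Synthetic two-class data, in the spirit of the paper's "virtual data"
X, y = make_classification(n_samples=500, n_features=10, random_state=0)

# Compare the three kernel families discussed in the abstract
for kernel in ("poly", "rbf", "sigmoid"):
    clf = SVC(kernel=kernel, C=1.0, gamma="scale")
    score = cross_val_score(clf, X, y, cv=5).mean()
    print(f"{kernel:>8}: {score:.3f}")
```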

A Study on the Performance Enhancement of Face Detection using SVM (SVM을 이용한 얼굴 검출 성능 향상에 대한 연구)

  • Lee, Chi-Ceun;Jung, Sung-Tae
    • Journal of the Korea Institute of Information and Communication Engineering / v.9 no.2 / pp.330-337 / 2005
  • This paper proposes a method that improves the performance of face detection by using an SVM (Support Vector Machine). First, it finds face region candidates by using an AdaBoost-based object detection method, which selects a small number of critical features from a larger set. Next, it classifies each candidate as face or non-face by using an SVM. Experimental results show that the proposed method improves the accuracy of face detection in comparison with the existing method.
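
The two-stage idea can be sketched roughly as follows, using OpenCV's Haar cascade (an AdaBoost-based detector) to propose candidate regions and a scikit-learn SVC to verify them. The cascade file, patch size, label convention, and the assumption of an already trained face/non-face SVM are illustrative choices, not the paper's exact pipeline.

```python
import cv2
import numpy as np
from sklearn.svm import SVC

def detect_faces(image_bgr, svm: SVC, patch_size=(24, 24)):
    """Two-stage detection: a Haar cascade (AdaBoost-based) proposes candidate
    regions, then an SVM trained on face / non-face patches accepts or rejects
    them. `svm` is assumed to be already fitted on flattened grayscale patches."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    candidates = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=3)
    faces = []
    for (x, y, w, h) in candidates:
        patch = cv2.resize(gray[y:y + h, x:x + w], patch_size).ravel() / 255.0
        if svm.predict(patch.reshape(1, -1))[0] == 1:   # assume 1 = face, 0 = non-face
            faces.append((x, y, w, h))
    return faces
```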

Design of Robust Support Vector Machine Using Genetic Algorithm (유전자 알고리즘을 이용한 강인한 Support vector machine 설계)

  • Lee, Hee-Sung;Hong, Sung-Jun;Lee, Byung-Yun;Kim, Eun-Tai
    • Journal of the Korean Institute of Intelligent Systems / v.20 no.3 / pp.375-379 / 2010
  • The support vector machine (SVM) has been widely used in a variety of pattern recognition problems applicable to recommendation systems, due to its strong theoretical foundation and excellent empirical successes. However, the SVM is sensitive to the presence of outliers, since outlier points can have the largest margin loss and play a critical role in determining the decision hyperplane. For a robust SVM, we limit the maximum value of the margin loss, which leads to a non-convex optimization problem. We therefore propose a design method for a robust SVM using a genetic algorithm (GA), which can solve the non-convex optimization problem. To demonstrate the performance of the proposed method, we perform experiments on various databases selected from the UCI repository.
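
One possible reading of "limiting the maximum margin loss" is a capped (truncated) hinge loss, which makes the objective non-convex; the numpy sketch below pairs such an objective for a linear SVM with a toy mutation-only genetic algorithm. Both the capped loss and the GA details are assumptions for illustration, not the paper's formulation.

```python
import numpy as np

def capped_hinge_objective(params, X, y, C=1.0, cap=2.0):
    # Linear SVM objective with the margin loss truncated at `cap`, so outliers
    # cannot dominate the fit; the truncation makes the problem non-convex,
    # hence the GA below instead of a standard QP solver.
    w, b = params[:-1], params[-1]
    loss = np.minimum(np.maximum(0.0, 1.0 - y * (X @ w + b)), cap)
    return 0.5 * w @ w + C * loss.sum()

def ga_minimize(obj, dim, pop=60, gens=200, sigma=0.3, seed=0):
    # Tiny mutation-only GA: keep the best half of the population and refill
    # it with Gaussian-mutated copies of the survivors.
    rng = np.random.default_rng(seed)
    P = rng.normal(scale=1.0, size=(pop, dim))
    for _ in range(gens):
        fitness = np.array([obj(p) for p in P])
        survivors = P[np.argsort(fitness)[: pop // 2]]
        children = survivors + rng.normal(scale=sigma, size=survivors.shape)
        P = np.vstack([survivors, children])
    return P[np.argmin([obj(p) for p in P])]

# Usage sketch, given data X (n x d) and labels y in {-1, +1}:
# best = ga_minimize(lambda p: capped_hinge_objective(p, X, y), X.shape[1] + 1)
```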

On the Fuzzy Membership Function of Fuzzy Support Vector Machines for Pattern Classification of Time Series Data (퍼지서포트벡터기계의 시계열자료 패턴분류를 위한 퍼지소속 함수에 관한 연구)

  • Lee, Soo-Yong
    • Journal of the Korean Institute of Intelligent Systems / v.17 no.6 / pp.799-803 / 2007
  • In this paper, we propose a new fuzzy membership function for the FSVM (Fuzzy Support Vector Machine). We assign a fuzzy membership value to each input point of the SVM and reformulate the SVM into a fuzzy SVM (FSVM) such that different input points make different contributions to the learning of the decision surface. The proposed method enhances the SVM by reducing the effect of outliers and noise in the data points. This paper compares the classification and estimation performance of the SVM and the FSVM(1) and FSVM(2) models, which are attracting attention in time series prediction.
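
A common way to realize fuzzy memberships in practice is to rescale each point's error penalty, which scikit-learn's SVC exposes as sample_weight (the effective penalty for point i becomes C times s_i). The recency-based membership function and the synthetic data below are purely illustrative and are not the paper's FSVM(1)/FSVM(2) designs.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(1)
X = rng.normal(size=(300, 5))                    # stand-in time-ordered features
y = (X[:, 0] + 0.3 * rng.normal(size=300) > 0).astype(int)

def recency_membership(n, floor=0.2):
    # One possible membership function for time-ordered data: recent points get
    # memberships near 1, older points decay linearly toward `floor`.
    return floor + (1.0 - floor) * np.arange(1, n + 1) / n

s = recency_membership(len(y))
clf = SVC(kernel="rbf", C=10.0, gamma="scale")
clf.fit(X, y, sample_weight=s)                   # membership s_i rescales the penalty to C*s_i
print(clf.score(X, y))
```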

A Study on Image Classification using Hybrid Method (하이브리드 기법을 이용한 영상 식별 연구)

  • Park, Sang-Sung;Jung, Gwi-Im;Jang, Dong-Sik
    • Journal of the Korea Society of Computer and Information / v.11 no.6 s.44 / pp.79-86 / 2006
  • Classification technology is essential for fast retrieval in large multimedia databases. This paper proposes a model combining a GA (Genetic Algorithm) and an SVM (Support Vector Machine) for fast retrieval. We used color and texture as feature vectors. We improved the retrieval accuracy by using the proposed model, which selects an optimal feature vector set from the extracted feature vector sets. The first test evaluated the performance of color, texture, and the feature vector combining color and texture. The second test evaluated the performance of the SVM and the proposed algorithm. In the experiments, the feature vector combining color and texture performed better than a single feature vector, and the proposed hybrid algorithm performed better than the SVM algorithm.
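
A hedged sketch of the feature side of such a system: a per-channel color histogram concatenated with a simple texture statistic and fed to an SVM. The specific descriptors below are assumptions, and the GA-based selection of the best feature subset is omitted here.

```python
import numpy as np
from sklearn.svm import SVC

def color_histogram(img_rgb, bins=8):
    # Per-channel color histogram, normalized; img_rgb is an HxWx3 uint8 array
    return np.concatenate([
        np.histogram(img_rgb[..., c], bins=bins, range=(0, 256), density=True)[0]
        for c in range(3)])

def texture_stats(img_rgb):
    # Very simple texture descriptor: mean and std of horizontal / vertical
    # gray-level differences (a stand-in for the paper's texture features)
    gray = img_rgb.mean(axis=2)
    dx, dy = np.diff(gray, axis=1), np.diff(gray, axis=0)
    return np.array([dx.mean(), dx.std(), dy.mean(), dy.std()])

def combined_feature(img_rgb):
    return np.concatenate([color_histogram(img_rgb), texture_stats(img_rgb)])

# Usage sketch, given a list `images` of HxWx3 arrays and class ids `labels`:
# X = np.stack([combined_feature(im) for im in images])
# clf = SVC(kernel="rbf", C=1.0, gamma="scale").fit(X, labels)
```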

A Novel Image Classification Method for Content-based Image Retrieval via a Hybrid Genetic Algorithm and Support Vector Machine Approach

  • Seo, Kwang-Kyu
    • Journal of the Semiconductor & Display Technology / v.10 no.3 / pp.75-81 / 2011
  • This paper presents a novel method for image classification based on a hybrid genetic algorithm (GA) and support vector machine (SVM) approach, which can significantly improve the classification performance for content-based image retrieval (CBIR). Although the SVM has been widely applied to CBIR, it has some problems, such as setting the kernel parameters and selecting the feature subset, which affect the classification accuracy in the learning process. This study aims at simultaneously optimizing the SVM parameters and the feature subset, without degrading the classification accuracy of the SVM, by using a GA for CBIR. Using the hybrid GA and SVM model, we can classify more images in the database effectively. Experiments were carried out on a large database of images, and the results show that the classification accuracy of the conventional SVM can be improved significantly by using the proposed model. We also found that the proposed model outperformed all the other models, such as the neural network and typical SVM models.
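
A minimal sketch of the joint optimization idea with numpy and scikit-learn: each GA chromosome encodes log-scaled SVM parameters (C, gamma) plus a binary feature mask, and fitness is cross-validated accuracy. The encoding, parameter ranges, and mutation-only GA are assumptions, not the paper's design.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

def decode(chrom):
    # Assumed chromosome layout: first two genes map to log10(C) and
    # log10(gamma); the remaining genes are thresholded into a feature mask.
    C = 10.0 ** (chrom[0] * 4 - 2)           # C in [1e-2, 1e2]
    gamma = 10.0 ** (chrom[1] * 4 - 3)       # gamma in [1e-3, 1e1]
    mask = chrom[2:] > 0.5
    if not mask.any():
        mask[0] = True                       # keep at least one feature
    return C, gamma, mask

def fitness(chrom, X, y):
    C, gamma, mask = decode(chrom)
    clf = SVC(kernel="rbf", C=C, gamma=gamma)
    return cross_val_score(clf, X[:, mask], y, cv=3).mean()

def ga_search(X, y, pop=20, gens=15, seed=0):
    # Mutation-only GA over chromosomes in [0, 1]^(2 + n_features)
    rng = np.random.default_rng(seed)
    P = rng.random((pop, 2 + X.shape[1]))
    for _ in range(gens):
        f = np.array([fitness(c, X, y) for c in P])
        parents = P[np.argsort(f)[-pop // 2:]]            # keep the best half
        children = np.clip(parents + rng.normal(0, 0.1, parents.shape), 0, 1)
        P = np.vstack([parents, children])
    best = P[np.argmax([fitness(c, X, y) for c in P])]
    return decode(best)
```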

Tuning the Architecture of Support Vector Machine: The Case of Bankruptcy Prediction

  • Min, Jae-H.;Jeong, Chul-Woo;Kim, Myung-Suk
    • Management Science and Financial Engineering / v.17 no.1 / pp.19-43 / 2011
  • Tuning the architecture of an SVM (support vector machine) means building an SVM model with better performance. Two tuning methods, the grid search and the GA (genetic algorithm), have been addressed in the literature, each of which has its own methodological pros and cons. This paper suggests a combined method for tuning the architecture of SVM models, which employs the GAM (generalized additive models), the grid search, and the GA in sequence. The GAM is used for selecting input variables, and the grid search and the GA are employed for finding the optimal parameter values of the SVM models. Applying the method to a bankruptcy prediction problem, we show that the SVM model tuned by the proposed method outperforms the other SVM models.
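
A sketch of the grid-search stage over (C, gamma) with scikit-learn is given below; the GAM-based variable selection and the subsequent GA refinement described in the abstract are not reproduced here, and the parameter grid is an illustrative assumption.

```python
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# X, y: financial-ratio features and bankrupt / non-bankrupt labels (assumed given)
param_grid = {
    "svc__C": [0.1, 1, 10, 100],
    "svc__gamma": [1e-3, 1e-2, 1e-1, 1],
}
pipe = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
search = GridSearchCV(pipe, param_grid, cv=5, scoring="accuracy")
# search.fit(X, y); print(search.best_params_, search.best_score_)
```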