• Title/Abstract/Keywords: Cross validation function

128 results (processing time: 0.031 seconds)

GLOBAL MINIMA OF LEAST SQUARES CROSS VALIDATION FOR A SYMMETRIC POLYNOMIAL KERNEL WITH FINITE SUPPORT

  • Jung, Kang-Mo; Kim, Byung-Chun
    • Journal of applied mathematics & informatics / Vol. 3, No. 2 / pp.183-192 / 1996
  • The least squares cross-validated bandwidth is the minimizer of the cross validation function for choosing the smoothing parameter of a kernel density estimator. It is a completely automatic method, but it requires inordinate amounts of computational time. We present a convenient formula for calculating the cross validation function when the kernel function is a symmetric polynomial with finite support. We also suggest an algorithm for finding global minima of the cross validation function.
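For reference, the LSCV criterion this abstract refers to can be written down in a few lines. The sketch below is an illustration, not the paper's formula or algorithm: it evaluates LSCV(h) = ∫f̂ₕ² − (2/n)Σᵢ f̂ₕ,₋ᵢ(xᵢ) with the Epanechnikov kernel, one example of a symmetric polynomial kernel with finite support, and minimizes over a bandwidth grid rather than searching for the exact global minimum.

```python
import numpy as np

def epanechnikov(u):
    """A symmetric polynomial kernel with finite support [-1, 1]."""
    return np.where(np.abs(u) <= 1.0, 0.75 * (1.0 - u**2), 0.0)

def lscv(h, x):
    """Least squares cross validation score:
    LSCV(h) = int f_h^2 - (2/n) * sum_i f_{h,-i}(x_i)."""
    n = len(x)
    # First term: integral of the squared density estimate (numerical).
    grid = np.linspace(x.min() - 3 * h, x.max() + 3 * h, 2000)
    fhat = epanechnikov((grid[:, None] - x[None, :]) / h).sum(axis=1) / (n * h)
    int_f2 = np.sum(fhat**2) * (grid[1] - grid[0])
    # Second term: leave-one-out density estimates at each observation.
    U = epanechnikov((x[:, None] - x[None, :]) / h)
    loo = (U.sum(axis=1) - np.diag(U)) / ((n - 1) * h)
    return int_f2 - 2.0 * loo.mean()

rng = np.random.default_rng(0)
x = rng.normal(size=200)
hs = np.linspace(0.2, 2.0, 40)
scores = [lscv(h, x) for h in hs]
h_star = hs[int(np.argmin(scores))]   # grid minimizer of LSCV
```

A grid search like this is exactly where the computational burden the abstract mentions comes from: each candidate bandwidth requires all pairwise kernel evaluations.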

Bandwidth selections based on cross-validation for estimation of a discontinuity point in density

  • Huh, Jib
    • Journal of the Korean Data and Information Science Society / Vol. 23, No. 4 / pp.765-775 / 2012
  • Cross-validation is commonly used to select the bandwidth, the smoothing parameter of a kernel estimator. Widely used cross-validation methods for selecting the bandwidth of a kernel estimator of a continuous density include least squares cross-validation and biased cross-validation, along with maximum likelihood cross-validation. When the density has a single discontinuity point, Huh (2012) proposed a bandwidth selection method based on maximum likelihood cross-validation for the kernel estimator used to estimate the discontinuity point. In this study, in the same spirit as the maximum likelihood cross-validation method of Huh (2012), we propose bandwidth selection methods based on least squares cross-validation and biased cross-validation using one-sided kernel functions, and compare them with the maximum likelihood cross-validation method of Huh (2012) through simulation studies.

Multiclass LS-SVM ensemble for large data

  • Hwang, Hyungtae
    • Journal of the Korean Data and Information Science Society / Vol. 26, No. 6 / pp.1557-1563 / 2015
  • Multiclass classification is typically performed using a voting scheme based on combining binary classifications. In this paper we propose a multiclass classification method for large data, which can be regarded as a revised one-vs-all method. Multiclass classification is performed using the hat matrix of a least squares support vector machine (LS-SVM) ensemble, obtained by aggregating individual LS-SVMs trained on each subset of the whole large data set. A cross validation function is defined to select the optimal values of the hyperparameters that affect the performance of the proposed multiclass LS-SVM. We obtain a generalized cross validation function to reduce the computational burden of the cross validation function. Experimental results are presented to indicate the performance of the proposed method.
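The generalized cross validation function mentioned here can be illustrated compactly. The sketch below is a simplified stand-in for the paper's ensemble construction (a single LS-SVM with an RBF kernel, no bias term, and an arbitrary hyperparameter grid are all assumptions of the sketch): it forms the hat matrix H with ŷ = Hy and scores each regularization value γ by GCV(γ) = n·‖(I−H)y‖² / tr(I−H)².

```python
import numpy as np

def rbf_kernel(X, Z, sigma):
    """Gaussian RBF kernel matrix between the rows of X and Z."""
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma**2))

def lssvm_gcv(X, y, gamma, sigma):
    """GCV score for an LS-SVM fit (bias term omitted for simplicity)."""
    n = len(y)
    K = rbf_kernel(X, X, sigma)
    H = K @ np.linalg.inv(K + np.eye(n) / gamma)   # hat matrix: y_hat = H y
    resid = y - H @ y
    return n * (resid @ resid) / np.trace(np.eye(n) - H) ** 2

rng = np.random.default_rng(1)
X = rng.uniform(-3, 3, size=(80, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=80)
scores = {g: lssvm_gcv(X, y, g, sigma=1.0) for g in (0.1, 1.0, 10.0, 100.0)}
best_gamma = min(scores, key=scores.get)
```

The appeal of GCV here is the same one the abstract points to: the score uses only the trace and residuals of one fit, so no data point needs to be held out and refit.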

SVC with Modified Hinge Loss Function

  • Lee, Sang-Bock
    • Journal of the Korean Data and Information Science Society / Vol. 17, No. 3 / pp.905-912 / 2006
  • Support vector classification (SVC) provides a more complete description of the linear and nonlinear relationships between input vectors and classifiers. In this paper we propose solving the optimization problem of SVC with a modified hinge loss function, which enables the use of an iteratively reweighted least squares (IRWLS) procedure. We also introduce an approximate cross validation function to select the hyperparameters that affect the performance of SVC. Experimental results are presented to illustrate the performance of the proposed procedure for classification.
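The IRWLS idea, solving a sequence of weighted least squares problems whose weights reproduce the hinge loss at the current iterate, can be sketched for a linear classifier. This is an illustrative toy version, not the paper's procedure: the weight formula a_i = C/(2e_i) and the tiny bias penalty are assumptions made for the sketch.

```python
import numpy as np

def irwls_svc(X, y, C=1.0, n_iter=30, eps=1e-6):
    """Linear SVC fitted by iteratively reweighted least squares (IRWLS).
    Each iteration solves a weighted ridge problem whose weights make the
    quadratic loss match the hinge loss at the current solution."""
    n, p = X.shape
    Xb = np.hstack([X, np.ones((n, 1))])          # absorb the bias term
    P = np.eye(p + 1)
    P[-1, -1] = 1e-6                              # tiny bias penalty, for conditioning
    w = np.zeros(p + 1)
    for _ in range(n_iter):
        err = np.maximum(1.0 - y * (Xb @ w), 0.0)  # hinge errors e_i
        # Weight a_i = C / (2 e_i): the weighted quadratic a_i e^2 then has
        # the same value and slope as the hinge loss at the current e_i.
        a = np.where(err > eps, C / (2.0 * err + eps), 0.0)
        w = np.linalg.solve(P + Xb.T @ (a[:, None] * Xb), Xb.T @ (a * y))
    return w

rng = np.random.default_rng(2)
X = rng.normal(size=(100, 2))
y = np.where(X[:, 0] + X[:, 1] > 0, 1.0, -1.0)    # linearly separable toy data
w = irwls_svc(X, y)
pred = np.sign(np.hstack([X, np.ones((100, 1))]) @ w)
accuracy = np.mean(pred == y)
```

Each step is an ordinary linear solve, which is what makes the IRWLS route attractive compared with solving the quadratic program of standard SVC directly.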

Sparse kernel classification using IRWLS procedure

  • Kim, Dae-Hak
    • Journal of the Korean Data and Information Science Society / Vol. 20, No. 4 / pp.749-755 / 2009
  • Support vector classification (SVC) provides a more complete description of the linear and nonlinear relationships between input vectors and classifiers. In this paper, we propose a sparse kernel classifier to solve the optimization problem of classification with a modified hinge loss function and absolute loss function, which provides efficient computation and sparsity. We also introduce a generalized cross validation function to select the hyperparameters that affect the classification performance of the proposed method. Experimental results are presented to illustrate the performance of the proposed procedure for classification.

e-SVR using IRWLS Procedure

  • Shim, Joo-Yong
    • Journal of the Korean Data and Information Science Society / Vol. 16, No. 4 / pp.1087-1094 / 2005
  • e-insensitive support vector regression (e-SVR) is capable of providing a more complete description of the linear and nonlinear relationships among random variables. In this paper we propose an iteratively reweighted least squares (IRWLS) procedure to solve the quadratic problem of e-SVR with a modified loss function. Furthermore, we introduce a generalized approximate cross validation function to select the hyperparameters that affect the performance of e-SVR. Experimental results are presented to illustrate the performance of the IRWLS procedure for e-SVR.

Mixed-effects LS-SVR for longitudinal data

  • Cho, Dae-Hyeon
    • Journal of the Korean Data and Information Science Society / Vol. 21, No. 2 / pp.363-369 / 2010
  • In this paper we propose a mixed-effects least squares support vector regression (LS-SVR) for longitudinal data. We add a random-effect term to the optimization function of LS-SVR to incorporate random effects into LS-SVR for analyzing longitudinal data. We also present a model selection method that employs a generalized cross validation function for choosing the hyperparameters that affect the performance of the mixed-effects LS-SVR. A simulated example is provided to indicate the usefulness of the mixed-effects method for analyzing longitudinal data.

LS-SVM for large data sets

  • Park, Hongrak; Hwang, Hyungtae; Kim, Byungju
    • Journal of the Korean Data and Information Science Society / Vol. 27, No. 2 / pp.549-557 / 2016
  • In this paper we propose a multiclassification method for large data sets that ensembles least squares support vector machines (LS-SVM) built on principal components instead of the raw input vector. We use the revised one-vs-all method for multiclassification, a voting scheme based on combining several binary classifications. The revised one-vs-all method is performed using the hat matrix of the LS-SVM ensemble, obtained by ensembling LS-SVMs trained on random samples from the whole large training data set. The leave-one-out cross validation (CV) function is used to find the optimal values of the hyperparameters that affect the performance of the multiclass LS-SVM ensemble. We present a generalized cross validation function to reduce the computational burden of the leave-one-out CV function. Experimental results from real data sets illustrate the performance of the proposed multiclass LS-SVM ensemble.

Cox proportional hazard model with L1 penalty

  • Hwang, Chang-Ha; Shim, Joo-Yong
    • Journal of the Korean Data and Information Science Society / Vol. 22, No. 3 / pp.613-618 / 2011
  • The proposed method is based on a penalized log partial likelihood of the Cox proportional hazard model with an L1 penalty. We use an iteratively reweighted least squares procedure to solve the L1-penalized log partial likelihood function of the Cox proportional hazard model. It provides efficient computation, including variable selection, and leads to a generalized cross validation function for model selection. Experimental results are presented to indicate the performance of the proposed procedure.
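As a rough illustration of how an L1 penalty yields variable selection in a Cox model, the sketch below substitutes a proximal gradient step (a gradient step on the negative log partial likelihood followed by soft-thresholding) for the paper's IRLS procedure; the data-generating setup and all tuning constants are assumptions of the sketch.

```python
import numpy as np

def cox_neg_gradient(beta, X, time, event):
    """Gradient of the negative log partial likelihood (Breslow, no ties)."""
    order = np.argsort(-time)                  # sort subjects by decreasing time
    Xs, ev = X[order], event[order].astype(bool)
    es = np.exp(Xs @ beta)
    S0 = np.cumsum(es)                         # risk-set sums of exp(eta)
    S1 = np.cumsum(es[:, None] * Xs, axis=0)   # risk-set sums of x * exp(eta)
    return -(Xs[ev] - S1[ev] / S0[ev, None]).sum(axis=0)

def soft_threshold(z, t):
    """Proximal operator of the L1 penalty: shrinks toward exact zeros."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_cox(X, time, event, lam=30.0, step=5e-4, n_iter=4000):
    """L1-penalized Cox fit by proximal gradient descent: a gradient step
    on the partial likelihood, then soft-thresholding (variable selection)."""
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        g = cox_neg_gradient(beta, X, time, event)
        beta = soft_threshold(beta - step * g, step * lam)
    return beta

rng = np.random.default_rng(3)
n, p = 200, 5
X = rng.normal(size=(n, p))
beta_true = np.array([1.0, -1.0, 0.0, 0.0, 0.0])
time = -np.log(rng.uniform(size=n)) / np.exp(X @ beta_true)  # exponential times
event = np.ones(n)                                           # no censoring
beta_hat = lasso_cox(X, time, event)
```

The soft-thresholding step is what produces exact zeros in the coefficient vector, which is the variable-selection behavior the abstract attributes to the L1 penalty.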