• Title/Abstract/Keyword: Generalized Cross-Validation

Search results: 77 (processing time: 0.025 s)

Estimating Variance Function with Kernel Machine

  • Kim, Jong-Tae;Hwang, Chang-Ha;Park, Hye-Jung;Shim, Joo-Yong
    • Communications for Statistical Applications and Methods
    • Vol. 16, No. 2 / pp. 383-388 / 2009
  • In this paper we propose a variance function estimation method based on the kernel trick for replicated data or data consisting of sample variances. The Newton-Raphson method is used to obtain the associated parameter vector. Furthermore, the generalized approximate cross-validation function is introduced to select the hyper-parameters that affect the performance of the proposed variance function estimation method. Experimental results are then presented which illustrate the performance of the proposed procedure.
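The generalized cross-validation idea that recurs throughout these papers can be sketched for a plain kernel ridge smoother. This is an illustrative outline only, not the authors' code; the RBF kernel, its bandwidth, and the search grid are assumptions:

```python
import numpy as np

def gcv_score(K, y, lam):
    """GCV score for a kernel ridge smoother with Gram matrix K.

    The smoother matrix is H = K (K + lam*I)^(-1); GCV approximates the
    leave-one-out error by n * RSS / (n - trace(H))^2.
    """
    n = len(y)
    H = K @ np.linalg.inv(K + lam * np.eye(n))
    resid = y - H @ y
    edf = np.trace(H)  # effective degrees of freedom
    return n * (resid @ resid) / (n - edf) ** 2

# Toy 1-D data and a Gaussian (RBF) Gram matrix
rng = np.random.default_rng(0)
x = np.linspace(0, 1, 50)
y = np.sin(2 * np.pi * x) + 0.3 * rng.standard_normal(50)
K = np.exp(-((x[:, None] - x[None, :]) ** 2) / (2 * 0.1**2))

# Let the data choose the regularization parameter via GCV
lams = 10.0 ** np.arange(-6, 1)
best_lam = min(lams, key=lambda lam: gcv_score(K, y, lam))
```

The same score can rank any hyper-parameter that enters the smoother matrix, e.g. the kernel bandwidth, by recomputing K inside the loop.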

A transductive least squares support vector machine with the difference convex algorithm

  • Shim, Jooyong;Seok, Kyungha
    • Journal of the Korean Data and Information Science Society
    • Vol. 25, No. 2 / pp. 455-464 / 2014
  • Unlabeled examples are easier and less expensive to obtain than labeled examples. Semisupervised approaches are used to utilize such examples in an effort to boost the predictive performance. This paper proposes a novel semisupervised classification method named the transductive least squares support vector machine (TLS-SVM), which is based on the least squares support vector machine. The proposed method utilizes the difference convex algorithm to derive nonconvex minimization solutions for the TLS-SVM. A generalized cross-validation method is also developed to choose the hyperparameters that affect the performance of the TLS-SVM. The experimental results confirm the successful performance of the proposed TLS-SVM.

A Simulation Study on Regularization Method for Generating Non-Destructive Depth Profiles from Angle-Resolved XPS Data

  • Ro, Chul-Un
    • 분석과학 (Analytical Science and Technology)
    • Vol. 8, No. 4 / pp. 707-714 / 1995
  • Two types of regularization method (singular-system and HMP approaches) for generating depth-concentration profiles from angle-resolved XPS data were evaluated. Both approaches showed qualitatively similar results although they employed different numerical algorithms. The application of the regularization method to simulated data demonstrates its excellent utility for complex depth-profile systems. This includes the stable restoration of depth-concentration profiles from data with considerable random error, and the automatic choice of the smoothing parameter that is imperative for the successful application of the regularization method. The automatic choice of the smoothing parameter is based on the generalized cross-validation method, which lets the data themselves choose the optimal value of the parameter.


SVQR with asymmetric quadratic loss function

  • Shim, Jooyong;Kim, Malsuk;Seok, Kyungha
    • Journal of the Korean Data and Information Science Society
    • Vol. 26, No. 6 / pp. 1537-1545 / 2015
  • Support vector quantile regression (SVQR) can be obtained by applying the support vector machine with a check function, instead of an ε-insensitive loss function, to quantile regression; this still requires solving a quadratic programming (QP) problem, which is expensive in time and memory. In this paper we propose an SVQR whose objective function is composed of an asymmetric quadratic loss function. The proposed method overcomes the weak point of the SVQR with the check function. We use an iterative procedure to solve the objective problem. Furthermore, we introduce the generalized cross-validation function to select the hyper-parameters that affect the performance of the SVQR. Experimental results are then presented which illustrate the performance of the proposed SVQR.
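An asymmetric quadratic loss can be minimized by a simple reweighting iteration. The sketch below shows only that mechanic on an ordinary linear model, not the kernelized SVQR of the paper, and the data are invented:

```python
import numpy as np

def asymmetric_ls(X, y, tau, n_iter=50):
    """Linear fit under an asymmetric quadratic loss: a residual r is
    weighted by tau if r >= 0 and by 1 - tau otherwise, and the weighted
    least-squares problem is re-solved until the weights settle."""
    beta = np.linalg.lstsq(X, y, rcond=None)[0]  # symmetric start
    for _ in range(n_iter):
        r = y - X @ beta
        w = np.where(r >= 0, tau, 1.0 - tau)
        beta = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * y))
    return beta

rng = np.random.default_rng(2)
X = np.column_stack([np.ones(200), rng.standard_normal(200)])
y = X @ np.array([0.0, 1.0]) + rng.standard_normal(200)
low = asymmetric_ls(X, y, 0.1)   # fit pulled below the conditional mean
high = asymmetric_ls(X, y, 0.9)  # fit pulled above the conditional mean
```

Unlike the check function, each step is a smooth weighted least-squares solve, which is why no QP solver is needed.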

Robust varying coefficient model using L1 regularization

  • Hwang, Changha;Bae, Jongsik;Shim, Jooyong
    • Journal of the Korean Data and Information Science Society
    • Vol. 27, No. 4 / pp. 1059-1066 / 2016
  • In this paper we propose a robust version of varying coefficient models, based on regularized regression with L1 regularization. We use the iteratively reweighted least squares procedure to solve the L1-regularized objective function of the varying coefficient model in locally weighted regression form. It provides efficient computation of the coefficient function estimates and variable selection for a given value of the smoothing variable. We present the generalized cross-validation function and an Akaike-information-type criterion for model selection. Applications of the proposed model are illustrated through artificial examples and a real example of predicting the effect of the input variables and the smoothing variable on the output.
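The IRWLS treatment of an L1 penalty can be sketched on a plain linear model: |b_j| is majorized by b_j^2 / |b_j|, so each pass solves a ridge-like system with coefficient-specific weights. This is only the core mechanic, not the locally weighted varying-coefficient estimator of the paper; the design and penalty level are assumed:

```python
import numpy as np

def irls_lasso(X, y, lam, n_iter=100, eps=1e-8):
    """L1-penalized least squares via iteratively reweighted least squares:
    the penalty lam * |b_j| is replaced by lam * b_j^2 / max(|b_j|, eps),
    giving a coefficient-specific ridge weight at each iteration."""
    beta = np.linalg.lstsq(X, y, rcond=None)[0]  # OLS start
    for _ in range(n_iter):
        d = lam / np.maximum(np.abs(beta), eps)
        beta = np.linalg.solve(X.T @ X + np.diag(d), X.T @ y)
    return beta

rng = np.random.default_rng(3)
X = rng.standard_normal((100, 5))
beta_true = np.array([3.0, 0.0, 0.0, 1.5, 0.0])
y = X @ beta_true + 0.1 * rng.standard_normal(100)
beta = irls_lasso(X, y, lam=1.0)  # irrelevant coefficients are driven toward zero
```

The shrinking weights perform the variable selection mentioned in the abstract: a coefficient near zero receives an ever larger ridge weight and is pinned there.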

Kernel Poisson Regression for Longitudinal Data

  • Shim, Joo-Yong;Seok, Kyung-Ha
    • Journal of the Korean Data and Information Science Society
    • Vol. 19, No. 4 / pp. 1353-1360 / 2008
  • An estimating procedure is introduced for nonlinear mixed-effect Poisson regression for longitudinal studies, where data from different subjects are independent whereas data from the same subject are correlated. The proposed procedure provides estimates of the mean function of the response variables, where the canonical parameter is related to the input vector in a nonlinear form. The generalized cross-validation function is introduced to choose optimal hyper-parameters in the procedure. Experimental results are then presented, which indicate the performance of the proposed estimating procedure.
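Estimation in a Poisson model with log link is typically carried out by Fisher scoring (IRLS). A minimal fixed-effect, non-kernel sketch follows; the design and coefficients are invented, and the mixed-effect and kernel parts of the paper are omitted:

```python
import numpy as np

def poisson_irls(X, y, n_iter=25):
    """Poisson regression with log link via Fisher scoring: at each step
    the working response z = eta + (y - mu) / mu is regressed on X with
    weights mu, where mu = exp(eta)."""
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        eta = X @ beta
        mu = np.exp(eta)
        z = eta + (y - mu) / mu
        beta = np.linalg.solve(X.T @ (mu[:, None] * X), X.T @ (mu * z))
    return beta

rng = np.random.default_rng(5)
X = np.column_stack([np.ones(300), rng.standard_normal(300)])
y = rng.poisson(np.exp(X @ np.array([0.5, 0.8])))
beta = poisson_irls(X, y)
```

In the kernelized version the linear predictor X @ beta is replaced by a kernel expansion, and GCV on the resulting smoother selects the hyper-parameters.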


Feature selection in the semivarying coefficient LS-SVR

  • Hwang, Changha;Shim, Jooyong
    • Journal of the Korean Data and Information Science Society
    • Vol. 28, No. 2 / pp. 461-471 / 2017
  • In this paper we propose a feature selection method that identifies important features in the semivarying coefficient model. One important issue in the semivarying coefficient model is how to estimate the parametric and nonparametric components; another is how to identify the important features in the varying and the constant effects. We propose a feature selection method able to address these issues using generalized cross-validation functions of the varying coefficient least squares support vector regression (LS-SVR) and the linear LS-SVR. Numerical studies indicate that the proposed method is quite effective in identifying important features in both the varying and the constant effects of the semivarying coefficient model.

A Linear Smoothing Spline Estimation and Applications (선형 평활스플라인 함수 추정과 적용)

  • 윤용화;김경무;김종태
    • Journal of the Korean Data and Information Science Society
    • Vol. 9, No. 1 / pp. 29-36 / 1998
  • The purpose of this paper is to make linear smoothing spline estimation easier and more efficient to use by developing an algorithm for the linear smoothing spline estimator proposed theoretically by Eubank (1994, 1997). Using this algorithm, the goodness of fit of the estimator was examined on examples from several models, and the proposed linear smoothing spline estimator was found to fit well as a tool for nonparametric function estimation.
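A linear smoothing-spline fit can be emulated with a truncated-linear (hinge) basis plus a ridge penalty on the knot coefficients. This simplified sketch is not Eubank's estimator; the knots, penalty level, and test signal are all assumed:

```python
import numpy as np

def linear_smoothing_spline(x, y, knots, lam):
    """Penalized regression on a truncated-linear basis, a simple stand-in
    for a linear smoothing spline. The ridge penalty acts only on the knot
    coefficients, so the overall linear trend is left unpenalized."""
    B = np.column_stack([np.ones_like(x), x] +
                        [np.maximum(x - t, 0.0) for t in knots])
    P = np.eye(B.shape[1])
    P[0, 0] = P[1, 1] = 0.0  # no penalty on intercept and slope
    coef = np.linalg.solve(B.T @ B + lam * P, B.T @ y)
    return B @ coef

rng = np.random.default_rng(4)
x = np.linspace(0, 1, 80)
y = np.abs(x - 0.5) + 0.05 * rng.standard_normal(80)  # piecewise-linear signal
fit = linear_smoothing_spline(x, y, knots=np.linspace(0.1, 0.9, 9), lam=0.1)
```

Larger lam flattens the fit toward a single straight line; lam itself can be chosen by generalized cross-validation as in the other entries on this page.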


Two-step LS-SVR for censored regression

  • Bae, Jong-Sig;Hwang, Chang-Ha;Shim, Joo-Yong
    • Journal of the Korean Data and Information Science Society
    • Vol. 23, No. 2 / pp. 393-401 / 2012
  • This paper deals with estimation of the least squares support vector regression when the responses are subject to random right censoring. The estimation is performed in two steps: the ordinary least squares support vector regression, followed by the least squares support vector regression with censored data. We use the empirical fact that the estimated regression functions under random right censoring are closer to the true regression functions than the observed failure times are. The hyper-parameters of the model, which affect the performance of the proposed procedure, are selected by a generalized cross-validation function. Experimental results are then presented which indicate the performance of the proposed procedure.

Sparse Kernel Regression using IRWLS Procedure

  • Park, Hye-Jung
    • Journal of the Korean Data and Information Science Society
    • Vol. 18, No. 3 / pp. 735-744 / 2007
  • The support vector machine (SVM) is capable of providing a complete description of the linear and nonlinear relationships among random variables. In this paper we propose a sparse kernel regression (SKR) to overcome a weak point of the SVM: the steep growth of the number of support vectors as the number of training data increases. The iteratively reweighted least squares (IRWLS) procedure is used to solve the optimization problem of the SKR with a Laplacian prior. Furthermore, the generalized cross-validation (GCV) function is introduced to select the hyper-parameters which affect the performance of the SKR. Experimental results are then presented which illustrate the performance of the proposed procedure.
