• Title/Summary/Keyword: Generalized Cross-Validation

Search results: 77 items (processing time: 0.021 s)

Smoothing Parameter Selection Using Multifold Cross-Validation in Smoothing Spline Regressions

  • Hong, Changkon;Kim, Choongrak;Yoon, Misuk
    • Communications for Statistical Applications and Methods
    • /
    • Vol. 5, No. 2
    • /
    • pp.277-285
    • /
    • 1998
  • The smoothing parameter $\lambda$ in smoothing spline regression is usually selected by minimizing the cross-validation (CV) or generalized cross-validation (GCV) criterion. However, simple CV or GCV is a poor candidate for estimating prediction error. We define MGCV (Multifold Generalized Cross-Validation) as a criterion for selecting the smoothing parameter in smoothing spline regression; it is a version of cross-validation using the leave-$\kappa$-out method. Some numerical results comparing MGCV and GCV are presented.
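For any linear smoother $\hat{y} = S(\lambda)y$, GCV can be computed from the residual sum of squares and the trace of $S(\lambda)$. A minimal sketch, using a ridge-penalized polynomial basis as a stand-in for the smoothing spline (the basis, data, and grid below are illustrative assumptions, not taken from the paper):

```python
import numpy as np

def gcv_score(y, S):
    """GCV score for a linear smoother y_hat = S @ y."""
    n = len(y)
    resid = y - S @ y
    return n * np.sum(resid ** 2) / (n - np.trace(S)) ** 2

# Toy data: a noisy sine curve.
rng = np.random.default_rng(0)
x = np.linspace(0, 1, 50)
y = np.sin(2 * np.pi * x) + 0.2 * rng.standard_normal(50)

# Ridge smoother on a polynomial basis as a stand-in for the spline smoother.
X = np.vander(x, 8, increasing=True)
def smoother(lam):
    return X @ np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T)

lams = 10.0 ** np.arange(-8, 1)
best = min(lams, key=lambda lam: gcv_score(y, smoother(lam)))
```

The leave-$\kappa$-out MGCV criterion of the paper replaces the single-point deletion implicit in this formula with deletion of $\kappa$ observations at a time.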


Diagnostic In Spline Regression Model With Heteroscedasticity

  • Lee, In-Suk;Jung, Won-Tae;Jeong, Hye-Jeong
    • Journal of the Korean Data and Information Science Society
    • /
    • Vol. 6, No. 1
    • /
    • pp.63-71
    • /
    • 1995
  • We consider local influence for smoothing parameter estimates in the spline regression model with heteroscedasticity. In practice, generalized cross-validation does not work well in the presence of heteroscedasticity. We therefore propose a local influence measure for generalized cross-validation estimates when the errors are heteroscedastic, and examine the diagnostic effects of this measure on the Hyperinflation data.


GLOBAL GENERALIZED CROSS VALIDATION IN THE PRECONDITIONED GL-LSQR

  • Chung, Seiyoung;Oh, SeYoung;Kwon, SunJoo
    • Journal of the Chungcheong Mathematical Society
    • /
    • Vol. 32, No. 1
    • /
    • pp.149-156
    • /
    • 2019
  • This paper presents the global generalized cross validation as an appropriate choice of the regularization parameter in the preconditioned Gl-LSQR method for solving image deblurring problems. The regularization parameter chosen by the global generalized cross validation, combined with the preconditioned Gl-LSQR method, gives better reconstructions of the true image than the other parameters considered in this study.
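For Tikhonov regularization of a linear system $Ax \approx b$, the ordinary (non-global) GCV function can be evaluated cheaply through the SVD filter factors. A sketch under illustrative assumptions, using a small Gaussian blur-like matrix rather than the paper's preconditioned Gl-LSQR setting:

```python
import numpy as np

def tikhonov_gcv(A, b, lam):
    """GCV score for Tikhonov-regularized least squares, via the SVD of A."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    f = s ** 2 / (s ** 2 + lam ** 2)          # Tikhonov filter factors
    beta = U.T @ b
    resid2 = np.sum(((1 - f) * beta) ** 2)    # squared residual norm
    m = A.shape[0]
    return resid2 / (m - f.sum()) ** 2        # effective dof = sum of filters

# Ill-conditioned toy problem standing in for a deblurring system.
rng = np.random.default_rng(1)
n = 30
A = np.array([[np.exp(-0.5 * (i - j) ** 2) for j in range(n)] for i in range(n)])
x_true = np.sin(np.linspace(0, np.pi, n))
b = A @ x_true + 1e-3 * rng.standard_normal(n)

lams = 10.0 ** np.linspace(-6, 0, 25)
lam_gcv = min(lams, key=lambda lam: tikhonov_gcv(A, b, lam))
```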

Multiclass LS-SVM ensemble for large data

  • Hwang, Hyungtae
    • Journal of the Korean Data and Information Science Society
    • /
    • Vol. 26, No. 6
    • /
    • pp.1557-1563
    • /
    • 2015
  • Multiclass classification is typically performed by a voting scheme that combines binary classifications. In this paper we propose a multiclass classification method for large data, which can be regarded as a revised one-vs-all method. The multiclass classification is performed using the hat matrix of a least squares support vector machine (LS-SVM) ensemble, obtained by aggregating individual LS-SVMs trained on each subset of the whole large data set. A cross validation function is defined to select the optimal values of the hyperparameters that affect the performance of the proposed multiclass LS-SVM, and a generalized cross validation function is derived to reduce the computational burden of the cross validation function. Experimental results are presented which indicate the performance of the proposed method.
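Training each one-vs-all LS-SVM amounts to solving a single linear system, which is what makes the hat-matrix and ensemble machinery tractable. A toy sketch of the function-estimation form of LS-SVM with an RBF kernel (the kernel, data, and hyperparameter values are illustrative assumptions, not the paper's setup):

```python
import numpy as np

def lssvm_train(X, y, gamma=1.0, sigma=1.0):
    """Solve the LS-SVM linear system for one binary machine (RBF kernel)."""
    n = len(y)
    sq = np.sum(X ** 2, axis=1)
    K = np.exp(-(sq[:, None] + sq[None, :] - 2 * X @ X.T) / (2 * sigma ** 2))
    A = np.zeros((n + 1, n + 1))          # bordered system [[0, 1'], [1, K + I/gamma]]
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate([[0.0], y]))
    return sol[1:], sol[0], K             # alpha, bias, kernel matrix

# One-vs-all: one machine per class on +1/-1 targets, predict by max score.
rng = np.random.default_rng(2)
X = np.vstack([rng.normal(c, 0.3, size=(20, 2)) for c in (-2, 0, 2)])
labels = np.repeat([0, 1, 2], 20)
scores = []
for c in range(3):
    y = np.where(labels == c, 1.0, -1.0)
    alpha, b, K = lssvm_train(X, y)
    scores.append(K @ alpha + b)          # in-sample decision values
pred = np.argmax(np.vstack(scores), axis=0)
```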

PRECONDITIONED GL-CGLS METHOD USING REGULARIZATION PARAMETERS CHOSEN FROM THE GLOBAL GENERALIZED CROSS VALIDATION

  • Oh, SeYoung;Kwon, SunJoo
    • Journal of the Chungcheong Mathematical Society
    • /
    • Vol. 27, No. 4
    • /
    • pp.675-688
    • /
    • 2014
  • In this paper, we present an efficient way to determine a suitable value of the regularization parameter using the global generalized cross validation, and analyze the experimental results of the preconditioned global conjugate gradient linear least squares (Gl-CGLS) method in solving image deblurring problems. Preconditioned Gl-CGLS solves general linear systems with multiple right-hand sides, and it has been shown in [10] that the method can be effectively applied to image deblurring problems. The regularization parameter chosen by the global generalized cross validation, combined with the preconditioned Gl-CGLS method, gives better reconstructions of the true image than the other parameters considered in this study.

Kernel Ridge Regression with Randomly Right Censored Data

  • Shim, Joo-Yong;Seok, Kyung-Ha
    • Communications for Statistical Applications and Methods
    • /
    • Vol. 15, No. 2
    • /
    • pp.205-211
    • /
    • 2008
  • This paper deals with the estimation of kernel ridge regression when the responses are subject to random right censoring. The iteratively reweighted least squares (IRWLS) procedure is employed to treat censored observations. The hyperparameters of the model, which affect the performance of the proposed procedure, are selected by a generalized cross validation (GCV) function. Experimental results are presented which indicate the performance of the proposed procedure.
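Because kernel ridge regression is a linear smoother, $\hat{y} = K(K+\lambda I)^{-1}y$, a GCV function over its hyperparameters can be evaluated directly. A minimal sketch without the censoring/IRWLS machinery (the data, kernel width grid, and $\lambda$ grid are illustrative assumptions):

```python
import numpy as np

def krr_gcv(x, y, lam, sigma):
    """GCV score for kernel ridge regression with an RBF kernel."""
    n = len(y)
    K = np.exp(-(x[:, None] - x[None, :]) ** 2 / (2 * sigma ** 2))
    S = K @ np.linalg.solve(K + lam * np.eye(n), np.eye(n))  # smoother matrix
    resid = y - S @ y
    return n * np.sum(resid ** 2) / (n - np.trace(S)) ** 2

rng = np.random.default_rng(3)
x = np.linspace(-3, 3, 60)
y = np.tanh(x) + 0.1 * rng.standard_normal(60)

# Joint grid search over the ridge parameter and the kernel width.
grid = [(lam, sig) for lam in (1e-4, 1e-2, 1.0) for sig in (0.3, 1.0, 3.0)]
lam_best, sig_best = min(grid, key=lambda p: krr_gcv(x, y, *p))
```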

Cox proportional hazard model with L1 penalty

  • Hwang, Chang-Ha;Shim, Joo-Yong
    • Journal of the Korean Data and Information Science Society
    • /
    • Vol. 22, No. 3
    • /
    • pp.613-618
    • /
    • 2011
  • The proposed method is based on a penalized log partial likelihood of the Cox proportional hazard model with an L1 penalty. We use the iteratively reweighted least squares procedure to solve the L1-penalized log partial likelihood function of the Cox proportional hazard model. It provides efficient computation, including variable selection, and leads to a generalized cross validation function for model selection. Experimental results are presented to indicate the performance of the proposed procedure.

Censored Kernel Ridge Regression

  • Shim, Joo-Yong
    • Journal of the Korean Data and Information Science Society
    • /
    • Vol. 16, No. 4
    • /
    • pp.1045-1052
    • /
    • 2005
  • This paper deals with the estimation of kernel ridge regression when the responses are subject to random right censoring. The weighted data are formed by redistributing the weights of the censored data to the uncensored data; kernel ridge regression can then be carried out on the weighted data. The hyperparameters of the model, which affect the performance of the proposed procedure, are selected by a generalized approximate cross validation (GACV) function. Experimental results are presented which indicate the performance of the proposed procedure.
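One standard way to redistribute the weight of censored observations to the uncensored ones is through the Kaplan-Meier jumps ("redistribute-to-the-right"). The abstract does not state the paper's exact scheme, so the sketch below shows this common weighting as an assumption:

```python
import numpy as np

def km_weights(t, delta):
    """Kaplan-Meier jump weights for observations sorted by time t (no ties).

    delta[i] = 1 if t[i] is an observed event, 0 if censored. Each censored
    observation gets weight 0; its mass is pushed to later uncensored points.
    """
    n = len(t)
    w = np.zeros(n)
    surv = 1.0                        # survival estimate just before t[i]
    for i in range(n):
        if delta[i] == 1:
            w[i] = surv / (n - i)     # KM jump at an event time
        surv *= 1.0 - delta[i] / (n - i)
    return w

t = np.array([2.0, 3.0, 4.0, 5.0, 6.0])
delta = np.array([1, 0, 1, 1, 1])
w = km_weights(t, delta)
```

With no censoring the weights reduce to the uniform 1/n, so the weighted fit coincides with the ordinary one.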


e-SVR using IRWLS Procedure

  • Shim, Joo-Yong
    • Journal of the Korean Data and Information Science Society
    • /
    • Vol. 16, No. 4
    • /
    • pp.1087-1094
    • /
    • 2005
  • $\epsilon$-insensitive support vector regression (e-SVR) is capable of providing a more complete description of the linear and nonlinear relationships among random variables. In this paper we propose an iteratively reweighted least squares (IRWLS) procedure to solve the quadratic problem of e-SVR with a modified loss function. Furthermore, we introduce the generalized approximate cross validation function to select the hyperparameters that affect the performance of e-SVR. Experimental results are presented which illustrate the performance of the IRWLS procedure for e-SVR.
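For a linear model with a squared $\epsilon$-insensitive loss, a single IRWLS step reduces to a weighted ridge solve: residuals inside the tube get zero weight. The sketch below shows one such step; the weighting scheme, hyperparameters, and data are illustrative assumptions, not the paper's exact procedure:

```python
import numpy as np

def irwls_step(X, y, w, eps=0.1, C=10.0):
    """One IRWLS step for linear e-SVR with a squared eps-insensitive loss.

    Weight (|r| - eps)/|r| outside the tube matches the loss gradient at the
    current iterate; inside the tube the weight is zero.
    """
    r = y - X @ w
    a = np.zeros_like(r)
    out = np.abs(r) > eps
    a[out] = (np.abs(r[out]) - eps) / np.abs(r[out])
    A = X.T @ (C * a[:, None] * X) + np.eye(X.shape[1])  # weighted ridge system
    return np.linalg.solve(A, C * X.T @ (a * y))

rng = np.random.default_rng(4)
x = np.linspace(-1, 1, 50)
X = np.c_[x, np.ones(50)]                 # slope and intercept columns
y = 2.0 * x + 0.5 + 0.05 * rng.standard_normal(50)
w1 = irwls_step(X, y, np.zeros(2))        # one step from the zero vector
```

The full procedure iterates such steps, with safeguards such as damping or a line search, until the weights stabilize.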


Semisupervised support vector quantile regression

  • Seok, Kyungha
    • Journal of the Korean Data and Information Science Society
    • /
    • Vol. 26, No. 2
    • /
    • pp.517-524
    • /
    • 2015
  • Unlabeled examples are easier and less expensive to obtain than labeled examples. In this paper a semisupervised approach is used to utilize such examples in an effort to enhance the predictive performance of nonlinear quantile regression. We propose a semisupervised quantile regression method named semisupervised support vector quantile regression (S2SVQR), which is based on the support vector machine. A generalized approximate cross validation method is used to choose the hyperparameters that affect the performance of the estimator. The experimental results confirm the successful performance of the proposed S2SVQR.
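Quantile regression methods of this kind minimize the check (pinball) loss. A minimal illustration, with illustrative data and grid, that minimizing it over a constant recovers the empirical quantile:

```python
import numpy as np

def pinball_loss(residual, tau):
    """Check (pinball) loss for quantile level tau in (0, 1)."""
    return np.where(residual >= 0, tau * residual, (tau - 1) * residual)

# Minimizing the average pinball loss over a constant fit recovers the
# empirical tau-quantile of the sample.
rng = np.random.default_rng(5)
y = rng.standard_normal(200)
tau = 0.9
grid = np.linspace(-3, 3, 601)
c_best = grid[np.argmin([pinball_loss(y - c, tau).mean() for c in grid])]
```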