• Title/Abstract/Keywords: Ridge Regression

118 search results

Combining Ridge Regression and Latent Variable Regression

  • Kim, Jong-Duk
    • Journal of the Korean Data and Information Science Society, Vol. 18, No. 1, pp. 51-61, 2007
  • Ridge regression (RR), principal component regression (PCR) and partial least squares regression (PLS) are among the popular regression methods for collinear data. While RR adds a small quantity called the ridge constant to the diagonal of X'X to stabilize the matrix inversion and the regression coefficients, PCR and PLS use latent variables derived from the original variables to circumvent the collinearity problem. One problem of PCR and PLS is that they are very sensitive to overfitting. New regression methods are presented by combining RR with PCR and with PLS, respectively, in a unified manner. They are intended to provide better predictive ability and improved stability for regression models. A real-world data set from NIR spectroscopy is used to investigate the performance of the newly developed regression methods. (A minimal illustrative sketch of the ridge and principal-component computations appears below.)

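As a rough, self-contained illustration of the two ingredients named in the abstract above (not the paper's RPCR/RPLS algorithms or its NIR data), the sketch below fits ordinary least squares, ridge regression, principal component regression, and a naive ridge-on-components combination to synthetic collinear data; the ridge constant, component count, and all data-generating choices are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic collinear predictors: every column is a noisy copy of one factor.
n, p = 60, 8
t = rng.normal(size=(n, 1))
X = t + 0.05 * rng.normal(size=(n, p))        # highly collinear columns
y = X @ rng.normal(size=p) + 0.1 * rng.normal(size=n)

def ridge(X, y, k):
    """Ridge estimate: (X'X + kI)^{-1} X'y."""
    return np.linalg.solve(X.T @ X + k * np.eye(X.shape[1]), X.T @ y)

def pcr(X, y, n_comp):
    """Principal component regression on the first n_comp components."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    Z = U[:, :n_comp] * s[:n_comp]            # component scores
    gamma = np.linalg.lstsq(Z, y, rcond=None)[0]
    return Vt[:n_comp].T @ gamma              # back to the original coordinates

def ridge_on_components(X, y, n_comp, k):
    """A naive 'combined' estimator: ridge applied to the component scores."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    Z = U[:, :n_comp] * s[:n_comp]
    gamma = np.linalg.solve(Z.T @ Z + k * np.eye(n_comp), Z.T @ y)
    return Vt[:n_comp].T @ gamma

for name, b in [("OLS", np.linalg.lstsq(X, y, rcond=None)[0]),
                ("ridge k=1.0", ridge(X, y, 1.0)),
                ("PCR, 2 comps", pcr(X, y, 2)),
                ("ridge + PCR", ridge_on_components(X, y, 2, 1.0))]:
    print(f"{name:13s} ||beta|| = {np.linalg.norm(b):.3f}")
```

The printed coefficient norms simply show how much each method shrinks the fit relative to OLS on collinear data.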

Determination of Research Octane Number using NIR Spectral Data and Ridge Regression

  • 정호일;이혜선;전지혁
    • Bulletin of the Korean Chemical Society, Vol. 22, No. 1, pp. 37-42, 2001
  • Ridge regression is compared with multiple linear regression (MLR) for determination of the Research Octane Number (RON) when the baseline and signal-to-noise ratio are varied. MLR analysis of near-infrared (NIR) spectroscopic data usually encounters a collinearity problem, which adversely affects long-term prediction performance. The collinearity problem can be eliminated or greatly reduced by using ridge regression, which is a biased estimation method. To evaluate the robustness of each calibration, the models developed by both methods were used to predict the RONs of gasoline spectra in which the baseline and signal-to-noise ratio were varied. The ridge calibration model showed more stable prediction performance than MLR, especially when the spectral baselines were varied. In conclusion, ridge regression is shown to be a viable method for calibration of RON with NIR data when only a few wavelengths are available, such as in a hand-carried device using a few diodes. (A rough illustrative sketch of this kind of comparison follows below.)
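The comparison described above can be mimicked on toy data. The hedged sketch below (not the paper's gasoline spectra or calibration protocol) fits an unregularized multiple linear regression and a ridge model to synthetic, strongly collinear "spectra" and then predicts on copies whose baseline has been shifted; every constant is an arbitrary placeholder.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy "spectra": a few broad overlapping bands -> strongly collinear columns.
n_train, n_wave = 40, 10
base = rng.normal(size=(n_train, 2))
X = base @ rng.normal(size=(2, n_wave)) + 0.01 * rng.normal(size=(n_train, n_wave))
y = X @ rng.normal(size=n_wave) + 0.05 * rng.normal(size=n_train)

def fit(X, y, k=0.0):
    # k = 0 gives ordinary multiple linear regression; k > 0 gives ridge.
    return np.linalg.solve(X.T @ X + k * np.eye(X.shape[1]), X.T @ y)

b_mlr = fit(X, y, k=0.0)
b_ridge = fit(X, y, k=0.5)

# Perturb the "spectra" with a constant baseline shift and re-predict.
X_shift = X + 0.2
for name, b in [("MLR", b_mlr), ("ridge", b_ridge)]:
    rmse = np.sqrt(np.mean((X_shift @ b - y) ** 2))
    print(f"{name:6s} RMSE after baseline shift: {rmse:.3f}")
```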

Shrinkage Structure of Ridge Partial Least Squares Regression

  • Kim, Jong-Duk
    • Journal of the Korean Data and Information Science Society, Vol. 18, No. 2, pp. 327-344, 2007
  • Representative biased regression methods for multicollinear data include ridge regression (RR), principal component regression (PCR), and partial least squares regression (PLS). These methods are called shrinkage regressions in the sense that the norm of the coefficient-vector estimator is smaller than that of the ordinary least squares (OLS) estimator. Newer methods include ridge principal component regression (RPCR), which combines RR and PCR, and ridge partial least squares regression (RPLS), which combines RR and PLS; these are also shrinkage regressions. The estimators can be expressed as linear combinations of the eigenvectors of X'X, so the amount of shrinkage relative to OLS can be studied along each eigen direction. In this paper the estimators are first written in terms of general shrinkage factors, from which a general expression for the MSE is derived, along with an MSE expression for the PLS estimator. The shrinkage factor of RPLS is then derived in two different forms; substituting it into the general expression immediately gives the MSE of RPLS. However, the shrinkage factors of PLS and RPLS are complicated nonlinear functions of y rather than deterministic quantities, so the MSE expressions for these estimators are only approximate. Using them to evaluate PLS or RPLS is therefore of limited value, and the performance of these regressions needs to be assessed empirically. Using near-infrared spectroscopic data, a typical example of multicollinear data, we examine how the derived shrinkage factors change with the number of factors and what the overall shrinkage ratio is. A good understanding of these shrinkage patterns should help in assessing the predictive ability and stability of the regression methods. (A small numerical sketch of the ridge shrinkage factors follows below.)

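For ridge regression specifically, the per-direction shrinkage mentioned in the abstract has the well-known closed form λ_j/(λ_j + k), where λ_j is the j-th eigenvalue of X'X. The sketch below only evaluates those ridge factors numerically on synthetic collinear data; it does not reproduce the paper's PLS/RPLS shrinkage-factor or MSE derivations, and the design and ridge constant are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic collinear design.
n, p = 50, 5
t = rng.normal(size=(n, 2))
X = t @ rng.normal(size=(2, p)) + 0.05 * rng.normal(size=(n, p))

k = 0.3                                       # arbitrary ridge constant
eigvals = np.linalg.eigvalsh(X.T @ X)[::-1]   # eigenvalues of X'X, descending

# Ridge shrinks the OLS solution along eigen direction j by lambda_j / (lambda_j + k):
# the small-eigenvalue (collinear) directions are shrunk the most.
factors = eigvals / (eigvals + k)
for j, (lam, f) in enumerate(zip(eigvals, factors), start=1):
    print(f"direction {j}: eigenvalue = {lam:10.3f}, shrinkage factor = {f:.4f}")
```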

Censored Kernel Ridge Regression

  • Shim, Joo-Yong
    • Journal of the Korean Data and Information Science Society, Vol. 16, No. 4, pp. 1045-1052, 2005
  • This paper deals with the estimation of kernel ridge regression when the responses are subject to random right censoring. Weighted data are formed by redistributing the weights of the censored observations to the uncensored ones, and kernel ridge regression is then carried out on the weighted data. The model hyperparameters, which affect the performance of the proposed procedure, are selected by a generalized approximate cross-validation (GACV) function. Experimental results are presented which indicate the performance of the proposed procedure. (A hedged sketch of a weighted kernel ridge solve appears below.)

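The sketch below illustrates only the second step described above: a weighted kernel ridge solve with a Gaussian kernel and user-supplied observation weights. The censoring-based weight redistribution and the GACV selection of hyperparameters are not reproduced, and the kernel width, penalty, and weights shown are placeholder assumptions.

```python
import numpy as np

def gaussian_kernel(A, B, gamma=1.0):
    """Gram matrix K[i, j] = exp(-gamma * ||a_i - b_j||^2)."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-gamma * d2)

def weighted_kernel_ridge(X, y, w, lam=0.1, gamma=1.0):
    """Minimize sum_i w_i (y_i - f(x_i))^2 + lam * ||f||^2 in the RKHS.

    Setting the gradient to zero gives (W K + lam I) alpha = W y.
    """
    K = gaussian_kernel(X, X, gamma)
    W = np.diag(w)
    alpha = np.linalg.solve(W @ K + lam * np.eye(len(y)), W @ y)
    return lambda Xnew: gaussian_kernel(Xnew, X, gamma) @ alpha

# Toy data; the weights here are placeholders, not the paper's redistributed weights.
rng = np.random.default_rng(3)
X = rng.uniform(-2, 2, size=(30, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=30)
w = np.ones(30)          # e.g. censored points would end up down-weighted
f = weighted_kernel_ridge(X, y, w)
print(f(np.array([[0.0], [1.0]])))   # predictions at two test points
```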

ROBUST CROSS VALIDATIONS IN RIDGE REGRESSION

  • Jung, Kang-Mo
    • Journal of Applied Mathematics & Informatics, Vol. 27, No. 3-4, pp. 903-908, 2009
  • The shrinkage parameter in ridge regression may be contaminated by outlying points. We propose robust cross-validation scores for ridge regression in place of classical cross-validation. Robust location estimators such as the median, least trimmed squares, and the absolute mean are used to form the robust cross-validation scores, which have global robustness. Simulations are performed to show the effectiveness of the proposed estimators. (An illustrative sketch of robustly aggregated cross-validation errors follows below.)

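As an illustration of the idea of replacing the mean of cross-validation errors with robust location estimators, the sketch below computes leave-one-out ridge errors on synthetic data containing one gross outlier and aggregates them with the mean, the median, and a crude trimmed mean; these are stand-ins for, not reproductions of, the paper's robust scores.

```python
import numpy as np

rng = np.random.default_rng(4)
n, p = 40, 6
t = rng.normal(size=(n, 2))
X = t @ rng.normal(size=(2, p)) + 0.05 * rng.normal(size=(n, p))
y = X @ rng.normal(size=p) + 0.1 * rng.normal(size=n)
y[0] += 10.0                                   # one gross outlier

def loo_sq_errors(X, y, k):
    """Squared leave-one-out prediction errors for ridge with constant k."""
    errs = []
    for i in range(len(y)):
        mask = np.arange(len(y)) != i
        b = np.linalg.solve(X[mask].T @ X[mask] + k * np.eye(X.shape[1]),
                            X[mask].T @ y[mask])
        errs.append((y[i] - X[i] @ b) ** 2)
    return np.array(errs)

for k in (0.01, 0.1, 1.0):
    e = loo_sq_errors(X, y, k)
    trimmed = np.sort(e)[: int(0.8 * len(e))].mean()   # crude trimmed-mean score
    print(f"k={k:5.2f}  mean CV={e.mean():8.3f}  median CV={np.median(e):6.3f}  "
          f"trimmed CV={trimmed:6.3f}")
```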

능형 회귀에서의 민감도 분석에 관한 연구 (A Study on Sensitivity Analysis in Ridge Regression)

  • Kim, Soon-Kwi
    • 품질경영학회지, Vol. 19, No. 1, pp. 1-15, 1991
  • In this paper, we discuss and review various measures that have been presented for studying outliers, high-leverage points, and influential observations when ridge regression estimation is adopted. We derive the influence function for $\hat{\beta}_R$, the ridge regression estimator, and discuss its various finite-sample approximations when ridge regression is postulated. We also study several diagnostic measures such as the Welsch-Kuh distance and Cook's distance. (An illustrative sketch of ridge-based leverage and Cook-type diagnostics follows below.)

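Several of the diagnostics mentioned above are built from the ridge hat matrix H(k) = X(X'X + kI)^{-1}X'. The sketch below computes ridge leverages and a Cook-type distance on synthetic data using the familiar textbook leverage/residual form with the trace of H(k) as the effective number of parameters; this is an approximation for illustration, not the finite-sample formulas derived in the paper.

```python
import numpy as np

rng = np.random.default_rng(5)
n, p = 30, 4
X = rng.normal(size=(n, p))
y = X @ rng.normal(size=p) + 0.2 * rng.normal(size=n)
y[5] += 5.0                                   # plant one influential response

k = 0.5
H = X @ np.linalg.solve(X.T @ X + k * np.eye(p), X.T)   # ridge hat matrix
resid = y - H @ y
lever = np.diag(H)                            # ridge leverages
s2 = resid @ resid / (n - np.trace(H))        # residual variance with effective df

# Cook-type distance: the usual leverage/residual formula, with trace(H)
# standing in for the number of parameters.
cook = (resid ** 2 / (s2 * np.trace(H))) * lever / (1 - lever) ** 2
for i in np.argsort(cook)[-3:][::-1]:
    print(f"obs {i:2d}: leverage={lever[i]:.3f}, Cook-type distance={cook[i]:.3f}")
```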

Estimation of Ridge Regression under the Integrated Mean Square Error Criterion

  • Lim, Yong B.; Park, Chi H.; Park, Sung H.
    • Journal of the Korean Statistical Society, Vol. 9, No. 1, pp. 61-77, 1980
  • In response surface experiments, a polynomial model is often used to fit the response surface by the method of least squares. However, if the vectors of predictor variables are multicollinear, least squares estimates of the regression parameters have a high probability of being unsatisfactory. Hoerl and Kennard demonstrated that these undesirable effects of multicollinearity can be reduced by using "ridge" estimates in place of the least squares estimates. Ridge regression theory in the literature has been mainly concerned with the selection of k for the first-order polynomial regression model and the precision of $\hat{\beta}(k)$, the ridge estimator of the regression parameters. The problem considered in this paper is that of selecting k of ridge regression for a given polynomial regression model of arbitrary order. A criterion is proposed for the selection of k in the context of the integrated mean square error of the fitted responses, and is illustrated with an example. A type of admissibility condition is also established and proved for the proposed criterion. (A toy numerical sketch of the integrated MSE trade-off follows below.)

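The paper's data-based criterion and its admissibility condition are not reproduced here. As a hedged illustration of the underlying quantity only, the sketch below computes the mean square error of the fitted responses summed over the design points, E||Xβ̂(k) − Xβ||² = squared bias + variance, exactly for a toy quadratic model with known coefficients and scans a grid of k values; the design, coefficients, and noise level are arbitrary assumptions.

```python
import numpy as np

# Quadratic response-surface model in one factor; the columns 1, x, x^2 are
# fairly collinear because x is not centered or scaled.
x = np.linspace(1.0, 2.0, 15)
X = np.column_stack([np.ones_like(x), x, x ** 2])
beta = np.array([1.0, -2.0, 1.5])     # "true" coefficients, known only in this toy setup
sigma2 = 0.04                         # assumed error variance

def imse_of_fit(X, beta, sigma2, k):
    """Exact E||X beta_hat(k) - X beta||^2 = squared bias + variance of the fit."""
    A = X @ np.linalg.solve(X.T @ X + k * np.eye(X.shape[1]), X.T)
    bias2 = np.sum(((A - np.eye(X.shape[0])) @ X @ beta) ** 2)
    var = sigma2 * np.trace(A @ A.T)
    return bias2 + var

ks = np.logspace(-4, 1, 30)
scores = [imse_of_fit(X, beta, sigma2, k) for k in ks]
print(f"k minimizing the integrated MSE of the fitted responses: {ks[int(np.argmin(scores))]:.4g}")
```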

A Graphical Method for Evaluating the Mixture Component Effects of Ridge Regression Estimator in Mixture Experiments

  • Jang, Dae-Heung
    • Communications for Statistical Applications and Methods, Vol. 6, No. 1, pp. 1-10, 1999
  • When the component proportions in mixture experiments are restricted by lower and upper bounds, multicollinearity appears all too frequently. Ridge regression can be used to stabilize the coefficient estimates in the fitted model. I propose a graphical method for evaluating the mixture component effects of the ridge regression estimator with respect to prediction variance and prediction bias. (An illustrative sketch of these two quantities for a toy mixture design follows below.)

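The quantities that such a graphical method displays can be written down directly: at a prediction point x0, the ridge fit has scaled prediction variance x0'(X'X + kI)^{-1}X'X(X'X + kI)^{-1}x0 and a bias term involving the unknown coefficients. The sketch below evaluates both for a toy three-component mixture design, using the OLS fit as a plug-in for the truth in the bias term; the design, the blend point, and the plug-in choice are illustrative assumptions, and none of the paper's plots are reproduced.

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy three-component mixture design (proportions sum to 1), Scheffe linear model.
X = np.array([[0.6, 0.2, 0.2],
              [0.2, 0.6, 0.2],
              [0.2, 0.2, 0.6],
              [0.4, 0.4, 0.2],
              [0.4, 0.2, 0.4],
              [0.2, 0.4, 0.4],
              [1/3, 1/3, 1/3]])
y = X @ np.array([3.0, 1.0, 2.0]) + 0.05 * rng.normal(size=len(X))

x0 = np.array([0.5, 0.3, 0.2])                # a candidate blend to evaluate
b_ols = np.linalg.lstsq(X, y, rcond=None)[0]  # plug-in reference for the bias term

for k in (0.0, 0.05, 0.2):
    M = np.linalg.inv(X.T @ X + k * np.eye(3))
    # Scaled prediction variance x0' M X'X M x0 (multiply by sigma^2 for the variance).
    spv = x0 @ M @ X.T @ X @ M @ x0
    # Plug-in prediction bias x0'(M X'X - I) b, with the OLS fit in place of the truth.
    bias = x0 @ (M @ X.T @ X - np.eye(3)) @ b_ols
    print(f"k={k:4.2f}  scaled prediction variance={spv:7.3f}  plug-in bias={bias:7.3f}")
```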

준지도 커널능형회귀모형에 관한 연구 (A study on semi-supervised kernel ridge regression estimation)

  • 석경하
    • Journal of the Korean Data and Information Science Society, Vol. 24, No. 2, pp. 341-353, 2013
  • In data mining and machine learning applications, much research has been devoted to exploiting unlabeled data. This work initially concentrated on classification problems, but attention has recently shifted to regression. This study presents a semi-supervised regression method in the form of a kernel ridge regression model. Unlike existing transductive methods, the proposed method does not require estimating the labels of the unlabeled data, so it has fewer parameters to select and a simpler computation, and it also has an advantage in generalization. Simulation studies and a real-data analysis show that the proposed method uses the unlabeled data effectively and produces better estimates than methods that use labeled data only. (A hedged sketch of one standard semi-supervised kernel ridge approach appears below.)
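The translated abstract does not spell out the estimator, only that unlabeled data are used without estimating their labels. One standard technique with that property is Laplacian-regularized kernel ridge regression (manifold regularization), sketched below on toy data; it is named here as a stand-in and is not necessarily the estimator proposed in the paper, and every kernel width and penalty weight is an arbitrary assumption.

```python
import numpy as np

def gauss_kernel(A, B, gamma=2.0):
    """Gram matrix K[i, j] = exp(-gamma * ||a_i - b_j||^2)."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-gamma * d2)

rng = np.random.default_rng(8)

# 10 labeled and 40 unlabeled points from the same smooth curve.
X_l = rng.uniform(-2, 2, size=(10, 1))
y_l = np.sin(2 * X_l[:, 0]) + 0.1 * rng.normal(size=10)
X_u = rng.uniform(-2, 2, size=(40, 1))
X_all = np.vstack([X_l, X_u])
n, n_l = len(X_all), len(X_l)

K = gauss_kernel(X_all, X_all)
W = gauss_kernel(X_all, X_all, gamma=5.0)        # graph weights over all points
L = np.diag(W.sum(axis=1)) - W                   # graph Laplacian
J = np.diag((np.arange(n) < n_l).astype(float))  # selects the labeled points
y_ext = np.concatenate([y_l, np.zeros(n - n_l)]) # zeros are never used as labels

lam_a, lam_i = 0.1, 0.01                         # arbitrary regularization weights
# Minimizing sum_labeled (y_i - f(x_i))^2 + lam_a ||f||^2 + lam_i f'Lf over
# f = sum_j alpha_j K(x_j, .) gives (J K + lam_a I + lam_i L K) alpha = J y.
alpha = np.linalg.solve(J @ K + lam_a * np.eye(n) + lam_i * L @ K, J @ y_ext)

f = lambda Xnew: gauss_kernel(Xnew, X_all) @ alpha
print(f(np.array([[0.0], [1.0]])))               # predictions at two test points
```

The unlabeled inputs enter only through the kernel and graph Laplacian terms, so no labels are ever imputed for them, matching the property highlighted in the abstract.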