• Title/Summary/Keyword: principal component regression

Principal Component Regression by Principal Component Selection

  • Lee, Hosung;Park, Yun Mi;Lee, Seokho
    • Communications for Statistical Applications and Methods / v.22 no.2 / pp.173-180 / 2015
  • We propose a procedure for selecting principal components in principal component regression. Rather than retaining a small subset of leading principal components, our method chooses components using variable selection procedures. The procedure consists of two steps, intended to improve estimation and prediction: first, conventional principal component regression reduces the number of components to a candidate set; second, sparse regression techniques select components from that candidate set. The performance of our proposal is demonstrated numerically on synthetic and real datasets and compared with typical dimension reduction approaches, including principal component regression and partial least squares regression.
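The two-step recipe described in the abstract can be sketched in a few lines of numpy. This is a minimal illustration, not the authors' implementation: it assumes the sparse second step is a lasso on the candidate component scores, which, because the score vectors are orthogonal, reduces to soft-thresholding the per-component least squares coefficients. The name `pcr_select` and its arguments are hypothetical.

```python
import numpy as np

def pcr_select(X, y, n_candidates=5, lam=0.1):
    """Two-step sketch: (1) keep the leading n_candidates principal
    components; (2) select among them with an L1 penalty. Because the
    score vectors are orthogonal, the lasso solution is soft-thresholding
    of the per-component least squares coefficients."""
    Xc = X - X.mean(axis=0)
    yc = y - y.mean()
    # Step 1: candidate components from the SVD of the centered X.
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    V = Vt[:n_candidates].T            # loadings, p x m
    Z = Xc @ V                         # scores,   n x m (orthogonal columns)
    norms = (Z ** 2).sum(axis=0)
    beta_ols = (Z.T @ yc) / norms      # per-component least squares fits
    # Step 2: lasso on the scores = soft-thresholding (orthogonal design).
    beta = np.sign(beta_ols) * np.maximum(np.abs(beta_ols) - lam / norms, 0.0)
    selected = np.flatnonzero(beta)    # indices of surviving components
    coef = V @ beta                    # map back to the original predictors
    return coef, selected
```

With `lam=0` every candidate with a nonzero fit survives; as `lam` grows, components weakly related to `y` are dropped regardless of their eigenvalue rank.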

Numerical Investigations in Choosing the Number of Principal Components in Principal Component Regression - CASE I

  • Shin, Jae-Kyoung;Moon, Sung-Ho
    • Journal of the Korean Data and Information Science Society / v.8 no.2 / pp.127-134 / 1997
  • A method is proposed for choosing the number of principal components in principal component regression based on the predicted error sum of squares (PRESS). We evaluate this statistic approximately through a linear approximation based on a perturbation expansion. The proposed method is applied to various data sets, and some properties of the choice of the number of principal components in principal component regression are discussed.
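PRESS-based selection of the number of components can be sketched as follows. The paper evaluates PRESS through a perturbation-based linear approximation; this sketch instead uses the exact leave-one-out shortcut $e_i/(1-h_{ii})$ with the principal component scores held fixed across deletions, a simpler stand-in for the same model-selection idea. The names `pcr_press` and `choose_k` are hypothetical.

```python
import numpy as np

def pcr_press(X, y, k):
    """PRESS for PCR with k components, via the leave-one-out shortcut
    e_i / (1 - h_ii) applied to the regression on the scores.
    (Scores are treated as fixed, unlike the paper's approximation.)"""
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    Z = Xc @ Vt[:k].T                              # leading k score vectors
    A = np.column_stack([np.ones(len(y)), Z])      # intercept + scores
    H = A @ np.linalg.pinv(A.T @ A) @ A.T          # hat matrix
    resid = y - H @ y
    return float(np.sum((resid / (1.0 - np.diag(H))) ** 2))

def choose_k(X, y, k_max):
    """Pick the number of components (1..k_max) minimizing PRESS."""
    press = [pcr_press(X, y, k) for k in range(1, k_max + 1)]
    return int(np.argmin(press)) + 1, press
```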

Simple principal component analysis using Lasso (라소를 이용한 간편한 주성분분석)

  • Park, Cheolyong
    • Journal of the Korean Data and Information Science Society / v.24 no.3 / pp.533-541 / 2013
  • In this study, a simple principal component analysis using the Lasso is proposed. The method consists of two steps. The first step computes principal components by ordinary principal component analysis. The second step regresses each principal component on the original data matrix by the Lasso. Each new principal component is then computed as a linear combination of the original data matrix, using the scaled Lasso coefficient estimates as the weights of the combination. Because the least squares estimator in the regression of each principal component on the original data matrix is the corresponding eigenvector, the Lasso versions stay close to the original components while their sparsity yields more zero coefficients and hence easier interpretation. The method is applied to real and simulated data sets with the help of an R package for Lasso regression, and its usefulness is demonstrated.
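The two steps above translate directly into code. The sketch below is a hypothetical numpy rendering, not the paper's R implementation: `lasso_cd` is a plain coordinate-descent lasso, and `simple_sparse_pca` regresses each principal component score vector on the centered data matrix and normalizes the sparse coefficients into new loadings.

```python
import numpy as np

def lasso_cd(X, y, lam, n_iter=200):
    """Coordinate-descent lasso for (1/2n)||y - Xb||^2 + lam*||b||_1."""
    n, p = X.shape
    b = np.zeros(p)
    col_ss = (X ** 2).sum(axis=0) / n
    for _ in range(n_iter):
        for j in range(p):
            r = y - X @ b + X[:, j] * b[j]              # partial residual
            rho = X[:, j] @ r / n
            b[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_ss[j]
    return b

def simple_sparse_pca(X, n_comp=2, lam=0.1):
    """Two-step sketch: ordinary PCA, then lasso-regress each score
    vector on the centered data; the scaled sparse coefficients
    become the new (more interpretable) loadings."""
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    loadings = []
    for k in range(n_comp):
        z = Xc @ Vt[k]                                  # k-th PC scores
        b = lasso_cd(Xc, z, lam)
        nb = np.linalg.norm(b)
        loadings.append(b / nb if nb > 0 else b)        # unit-scaled loading
    return np.array(loadings).T                         # p x n_comp
```

With `lam=0` the regression recovers the eigenvector exactly; increasing `lam` zeroes out small loading entries.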

Numerical Investigations in Choosing the Number of Principal Components in Principal Component Regression - CASE II

  • Shin, Jae-Kyoung;Moon, Sung-Ho
    • Journal of the Korean Data and Information Science Society / v.10 no.1 / pp.163-172 / 1999
  • We propose a cross-validatory method for choosing the number of principal components in principal component regression based on the magnitudes of their correlations with y. There are two ways to order principal components: by eigenvalue (Shin and Moon, 1997) and by correlation with y. We apply our method to various data sets and compare the results of the two orderings.
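The difference between the two orderings is easy to make concrete. In this minimal sketch (the function name `order_components` is hypothetical), the eigenvalue ordering is the SVD's own variance ordering, while the alternative sorts components by the absolute correlation of each score vector with the response:

```python
import numpy as np

def order_components(X, y):
    """Return PC indices ordered two ways: by eigenvalue (variance)
    and by absolute correlation of each score vector with y."""
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    Z = Xc @ Vt.T                                   # all score vectors
    corr = np.array([np.corrcoef(Z[:, k], y)[0, 1] for k in range(Z.shape[1])])
    by_eigenvalue = np.arange(Z.shape[1])           # SVD already sorts by variance
    by_correlation = np.argsort(-np.abs(corr))
    return by_eigenvalue, by_correlation
```

The two orderings disagree exactly when a low-variance component carries most of the signal in y, which is the situation where correlation-based selection can outperform eigenvalue-based selection.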

On Sensitivity Analysis in Principal Component Regression

  • Kim, Soon-Kwi;Park, Sung H.
    • Journal of the Korean Statistical Society / v.20 no.2 / pp.177-190 / 1991
  • In this paper, we discuss and review various measures that have been presented for studying outliers, high-leverage points, and influential observations when principal component regression is adopted. We suggest several diagnostic measures for principal component regression and illustrate them with a numerical example, in which some individual data points are flagged as outliers, high-leverage points, or influential points.

Sensitivity Analysis in Principal Component Regression with Quadratic Approximation

  • Shin, Jae-Kyoung;Chang, Duk-Joon
    • Journal of the Korean Data and Information Science Society / v.14 no.3 / pp.623-630 / 2003
  • Recently, Tanaka (1988) derived two influence functions related to the eigenvalue problem $(A-\lambda_s I)v_s=0$ of a real symmetric matrix $A$ and used them for sensitivity analysis in principal component analysis. In this paper, we deal with the perturbation expansions of the same functions up to quadratic terms and discuss their application to sensitivity analysis in principal component regression analysis (PCRA). A numerical example is given to show how the approximation improves with the quadratic term.
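For reference, the textbook second-order perturbation expansion of a simple eigenvalue $\lambda_s$ and eigenvector $v_s$ of a real symmetric matrix $A$ under a symmetric perturbation $A + \varepsilon A^{(1)}$ has the form (Tanaka's influence-function versions are derived along the same lines, with $A^{(1)}$ the derivative of $A$ with respect to a case weight):

$$\lambda_s(\varepsilon) = \lambda_s + \varepsilon\, v_s^{\top} A^{(1)} v_s + \varepsilon^2 \sum_{t \neq s} \frac{\bigl(v_t^{\top} A^{(1)} v_s\bigr)^2}{\lambda_s - \lambda_t} + O(\varepsilon^3),$$

$$v_s(\varepsilon) = v_s + \varepsilon \sum_{t \neq s} \frac{v_t^{\top} A^{(1)} v_s}{\lambda_s - \lambda_t}\, v_t + O(\varepsilon^2).$$

The quadratic term in $\lambda_s(\varepsilon)$ is exactly what a linear approximation omits, which is why retaining it improves the approximation in the numerical example.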

Predicting Korea Pro-Baseball Rankings by Principal Component Regression Analysis (주성분회귀분석을 이용한 한국프로야구 순위)

  • Bae, Jae-Young;Lee, Jin-Mok;Lee, Jea-Young
    • Communications for Statistical Applications and Methods / v.19 no.3 / pp.367-379 / 2012
  • Predicting team rankings has long been a subject of interest for baseball fans. To predict the rankings based on 2011 data from Korea Professional Baseball records, the arithmetic mean method, the weighted average method, principal component analysis, and principal component regression analysis are considered. After comparing predictions based on the standardized arithmetic average, the correlation-weighted average, and principal component analysis, a principal component regression model was selected as the final model. By regressing the rankings on variables reduced by principal component analysis, we propose rank prediction models for the pitcher part, the batter part, and the combined pitcher-batter part, estimate the 2011 rankings from the fitted models, and use the models to predict the rankings for 2012.

Principal component regression for spatial data (공간자료 주성분분석)

  • Lim, Yaeji
    • The Korean Journal of Applied Statistics / v.30 no.3 / pp.311-321 / 2017
  • Principal component analysis is a popular statistical method for reducing the dimension of high dimensional climate data and extracting meaningful climate patterns. Based on the principal component analysis, a regression approach can further be applied for linear prediction of future climate, termed principal component regression (PCR). In this paper, we develop a new PCR method based on the regularized principal component analysis for spatial data proposed by Wang and Huang (2016) to account for the spatial features of climate data. We apply the proposed method to temperature prediction in the East Asia region and compare the results with those of conventional PCR.

A study on the properties of sensitivity analysis in principal component regression and latent root regression (주성분회귀와 고유값회귀에 대한 감도분석의 성질에 대한 연구)

  • Shin, Jae-Kyoung;Chang, Duk-Joon
    • Journal of the Korean Data and Information Science Society / v.20 no.2 / pp.321-328 / 2009
  • In regression analysis, the ordinary least squares estimates of regression coefficients become poor when the correlations among predictor variables are high. This phenomenon, called multicollinearity, causes serious problems in actual data analysis. To overcome multicollinearity, many methods have been proposed, including ridge regression, shrinkage estimators, and methods based on principal component analysis (PCA) such as principal component regression (PCR) and latent root regression (LRR). In the last decade, many statisticians have discussed sensitivity analysis (SA) in ordinary multiple regression and the same topic in PCR, LRR, and logistic principal component regression (LPCR); PCA plays an important role in these methods, and SA in PCA and related multivariate methods has also been widely discussed. We introduce the methods of PCR and LRR, present methods of SA in PCR and LRR, and discuss the properties of SA in these settings.

Combining Ridge Regression and Latent Variable Regression

  • Kim, Jong-Duk
    • Journal of the Korean Data and Information Science Society / v.18 no.1 / pp.51-61 / 2007
  • Ridge regression (RR), principal component regression (PCR), and partial least squares regression (PLS) are among the popular regression methods for collinear data. While RR adds a small quantity, called the ridge constant, to the diagonal of X'X to stabilize the matrix inversion and the regression coefficients, PCR and PLS use latent variables derived from the original variables to circumvent the collinearity problem. One problem with PCR and PLS is that they are very sensitive to overfitting. A new regression method is presented by combining RR with PCR and with PLS, respectively, in a unified manner; it is intended to provide better predictive ability and improved stability for regression models. Real-world data from NIR spectroscopy are used to investigate the performance of the newly developed method.
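The two ingredients being combined can be sketched side by side. The code below shows plain ridge (the ridge constant added to the diagonal of X'X) and one simple, hypothetical way of combining it with PCR, namely ridge applied to the leading component scores; this is an illustrative stand-in, not the paper's exact unified estimator, and `ridge` and `pcr_ridge` are names chosen here.

```python
import numpy as np

def ridge(X, y, lam):
    """Ridge regression: add the ridge constant lam to the diagonal
    of X'X before solving the normal equations."""
    Xc = X - X.mean(axis=0)
    yc = y - y.mean()
    p = X.shape[1]
    return np.linalg.solve(Xc.T @ Xc + lam * np.eye(p), Xc.T @ yc)

def pcr_ridge(X, y, k, lam):
    """A simple RR+PCR combination (illustrative, not the paper's
    estimator): ridge on the leading k principal component scores,
    mapped back to coefficients for the original predictors."""
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    Z = Xc @ Vt[:k].T
    b = np.linalg.solve(Z.T @ Z + lam * np.eye(k), Z.T @ (y - y.mean()))
    return Vt[:k].T @ b
```

The motivation in the abstract maps onto the two tuning knobs: truncating at k components removes the directions most prone to overfitting, while lam shrinks what remains, stabilizing the coefficients.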
