• Title/Summary/Keyword: Principal

Search Results: 7,151

Principal Component Regression by Principal Component Selection

  • Lee, Hosung;Park, Yun Mi;Lee, Seokho
    • Communications for Statistical Applications and Methods
    • /
    • v.22 no.2
    • /
    • pp.173-180
    • /
    • 2015
  • We propose a selection procedure for principal components in principal component regression. Our method selects principal components using variable selection procedures rather than retaining only a small subset of leading principal components. Our procedure consists of two steps intended to improve estimation and prediction. First, we reduce the number of principal components using conventional principal component regression to obtain a set of candidate principal components; we then select principal components from the candidate set using sparse regression techniques. The performance of our proposals is demonstrated numerically and compared with typical dimension reduction approaches (including principal component regression and partial least squares regression) on synthetic and real datasets.
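The two-step procedure described above can be sketched as follows (a minimal illustration on synthetic data, not the authors' implementation; the candidate-set size m and the use of scikit-learn's LassoCV as the sparse regression step are my assumptions):

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LassoCV

rng = np.random.default_rng(0)
n, p = 100, 10
X = rng.normal(size=(n, p))
beta = np.zeros(p)
beta[:3] = [2.0, -1.0, 1.5]
y = X @ beta + rng.normal(scale=0.5, size=n)

# Step 1: screen down to a candidate set of leading components (conventional PCR)
m = 6                                   # assumed candidate-set size
Z = PCA(n_components=m).fit_transform(X)

# Step 2: select components within the candidate set by sparse regression
lasso = LassoCV(cv=5).fit(Z, y)
selected = np.flatnonzero(lasso.coef_)
print("selected components:", selected)
```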

Classification via principal differential analysis

  • Jang, Eunseong;Lim, Yaeji
    • Communications for Statistical Applications and Methods
    • /
    • v.28 no.2
    • /
    • pp.135-150
    • /
    • 2021
  • We propose classification methods based on principal differential analysis. Computations of the squared multiple correlation function (RSQ) and principal differential analysis (PDA) scores are reviewed; in addition, we combine principal differential analysis results with logistic regression for binary classification. In a numerical study, we compare the principal differential analysis based classification methods with functional principal component analysis based classification. Various scenarios are considered in the simulation study, and the principal differential analysis based classification methods classify the functional data well. Gene expression data are considered for the real data analysis, where we observe that the PDA score based method also performs well.
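The overall scheme, scores from a functional decomposition fed into logistic regression, can be sketched on synthetic curves. PDA score computation itself (which estimates a linear differential operator) is not reproduced here; FPCA scores on discretized curves stand in for the score step, matching the comparison baseline the abstract mentions:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
t = np.linspace(0, 1, 50)               # common evaluation grid
n = 120
labels = rng.integers(0, 2, size=n)
# two classes of curves differing in frequency, plus observation noise
curves = np.array([np.sin(2 * np.pi * (1 + c) * t)
                   + rng.normal(scale=0.3, size=t.size) for c in labels])

# score step: FPCA on the discretized curves (stand-in for PDA scores)
scores = PCA(n_components=4).fit_transform(curves)

# classification step: logistic regression on the scores
Xtr, Xte, ytr, yte = train_test_split(scores, labels, random_state=0)
clf = LogisticRegression().fit(Xtr, ytr)
acc = clf.score(Xte, yte)
print(f"test accuracy: {acc:.2f}")
```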

ON GRADED RADICALLY PRINCIPAL IDEALS

  • Abu-Dawwas, Rashid
    • Bulletin of the Korean Mathematical Society
    • /
    • v.58 no.6
    • /
    • pp.1401-1407
    • /
    • 2021
  • Let R be a commutative G-graded ring with a nonzero unity. In this article, we introduce the concept of graded radically principal ideals. A graded ideal I of R is said to be graded radically principal if Grad(I) = Grad(〈c〉) for some homogeneous c ∈ R, where Grad(I) is the graded radical of I. The graded ring R is said to be graded radically principal if every graded ideal of R is graded radically principal. We study graded radically principal rings and prove a graded analogue of Cohen's theorem: a graded ring is graded radically principal if and only if every graded prime ideal is graded radically principal. Finally, we study the graded radically principal property for the polynomial ring R[X].
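As a concrete illustration of the definition (my example, not one from the paper): in the polynomial ring with its standard grading, a non-principal graded ideal can still be graded radically principal.

```latex
% Take R = k[x,y] with the standard \mathbb{Z}-grading and I = (x^2, xy).
% Every prime ideal containing x^2 and xy must contain x, and (x) \supseteq I, so
\[
\mathrm{Grad}(I) \;=\; \bigcap_{\substack{P \text{ prime}\\ P \supseteq I}} P
\;=\; (x) \;=\; \mathrm{Grad}(\langle x \rangle).
\]
% Here c = x is homogeneous, so I is graded radically principal
% even though I itself is not principal.
```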

Numerical Investigations in Choosing the Number of Principal Components in Principal Component Regression - CASE I

  • Shin, Jae-Kyoung;Moon, Sung-Ho
    • Journal of the Korean Data and Information Science Society
    • /
    • v.8 no.2
    • /
    • pp.127-134
    • /
    • 1997
  • A method is proposed for choosing the number of principal components in principal component regression based on the predicted error sum of squares (PRESS). To do this, we evaluate that statistic approximately, using a linear approximation based on a perturbation expansion. In this paper, we apply the proposed method to various data sets and discuss some properties of the choice of the number of principal components in principal component regression.
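The PRESS criterion can be illustrated directly (a sketch, not the paper's perturbation-based approximation): for each candidate number of components k, fit the score regression and apply the standard leave-one-out identity e_i/(1 - h_ii):

```python
import numpy as np

rng = np.random.default_rng(2)
n, p = 80, 8
X = rng.normal(size=(n, p))
y = X @ rng.normal(size=p) + rng.normal(scale=0.5, size=n)

Xc, yc = X - X.mean(0), y - y.mean()
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)

def press(k):
    """Leave-one-out PRESS for PCR with the first k components,
    computed exactly via the hat matrix (not the paper's linear
    approximation from a perturbation expansion)."""
    Z = Xc @ Vt[:k].T                       # scores on the first k components
    H = Z @ np.linalg.solve(Z.T @ Z, Z.T)   # hat matrix of the score regression
    e = yc - H @ yc
    return float(np.sum((e / (1.0 - np.diag(H))) ** 2))

press_values = {k: press(k) for k in range(1, p + 1)}
best_k = min(press_values, key=press_values.get)
print("PRESS-minimizing number of components:", best_k)
```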


Assessment and Classification of Korean Indigenous Corn Lines by Application of Principal Component Analysis (주성분분석에 의한 재래종 옥수수의 해석)

  • 이인섭;박종옥
    • Journal of Life Science
    • /
    • v.13 no.3
    • /
    • pp.343-348
    • /
    • 2003
  • This study was conducted to obtain basic information on Korean local corn lines collected from Busan City and Kyungnam Province; a total of 49 lines were selected and assessed by principal component analysis. In the principal component analysis of 7 characteristics, 67.4% and 86.3% of the total variation could be explained by the first two and first four principal components, respectively. The contribution of the characteristics was high for the upper principal components and low for the lower ones. The biological meaning of each principal component, and the plant types corresponding to it, were explained clearly by the correlation coefficients between principal components and characteristics. The first principal component corresponded to the size of plant and ear and the duration of the vegetative growing period; the second corresponded to the number of ears and tillers. The meanings of the third and fourth principal components were not clear.
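The analysis pipeline described above (correlation-matrix PCA, cumulative explained variance, and component-characteristic correlations) can be sketched as follows; the trait data are synthetic, with only the dimensions taken from the abstract:

```python
import numpy as np

rng = np.random.default_rng(3)
n_lines, n_traits = 49, 7               # 49 lines, 7 characteristics, as in the abstract
X = rng.normal(size=(n_lines, n_traits))
X[:, 1] += 0.8 * X[:, 0]                # correlated traits, as morphology data would show

Z = (X - X.mean(0)) / X.std(0)          # standardize before PCA
eigval, eigvec = np.linalg.eigh(np.corrcoef(Z, rowvar=False))
order = np.argsort(eigval)[::-1]
eigval, eigvec = eigval[order], eigvec[:, order]

# cumulative share of total variation carried by the leading components
explained = np.cumsum(eigval) / eigval.sum()
print("cumulative variance, first 4 PCs:", np.round(explained[:4], 3))

# correlation between each characteristic and each principal component
# (loading scaled by the square root of the eigenvalue)
corr = eigvec * np.sqrt(eigval)
```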

Arrow Diagrams for Kernel Principal Component Analysis

  • Huh, Myung-Hoe
    • Communications for Statistical Applications and Methods
    • /
    • v.20 no.3
    • /
    • pp.175-184
    • /
    • 2013
  • Kernel principal component analysis (PCA) maps observations in a nonlinear feature space to a reduced-dimensional plane of principal components. We do not need to specify the feature space explicitly because the procedure uses the kernel trick. In this paper, we propose a graphical scheme to represent variables in kernel principal component analysis. In addition, we propose an index for individual variables to measure their importance in the principal component plane.
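One hedged reading of the arrow scheme: after kernel PCA, each original variable receives arrow coordinates from its correlations with the two kernel principal component scores, and the arrow length serves as an importance index. The paper's exact construction may differ; the kernel parameter and the data here are illustrative:

```python
import numpy as np
from sklearn.decomposition import KernelPCA

rng = np.random.default_rng(4)
X = rng.normal(size=(150, 5))
X[:, 2] = X[:, 0] ** 2 + 0.1 * rng.normal(size=150)   # nonlinear structure

# map observations to the first two kernel principal components
kpca = KernelPCA(n_components=2, kernel="rbf", gamma=0.5)
scores = kpca.fit_transform(X)

# arrow coordinates: correlation of each original variable with each component
arrows = np.array([[np.corrcoef(X[:, j], scores[:, k])[0, 1]
                    for k in range(2)] for j in range(X.shape[1])])
importance = np.sqrt((arrows ** 2).sum(axis=1))       # arrow length as importance
print(np.round(arrows, 2))
```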

Numerical Investigations in Choosing the Number of Principal Components in Principal Component Regression - CASE II

  • Shin, Jae-Kyoung;Moon, Sung-Ho
    • Journal of the Korean Data and Information Science Society
    • /
    • v.10 no.1
    • /
    • pp.163-172
    • /
    • 1999
  • We propose a cross-validatory method for choosing the number of principal components in principal component regression based on the magnitudes of their correlations with y. There are two ways to order principal components for selection: by eigenvalue (Shin and Moon, 1997) and by correlation with y. We apply our method to various data sets and compare the results of the two methods.
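The correlation-based ordering can be sketched as follows (a minimal illustration; the fold layout and the use of plain k-fold cross-validation are my assumptions, not the paper's exact cross-validatory criterion):

```python
import numpy as np

rng = np.random.default_rng(5)
n, p = 90, 8
X = rng.normal(size=(n, p))
y = X @ rng.normal(size=p) + rng.normal(scale=0.5, size=n)

Xc, yc = X - X.mean(0), y - y.mean()
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
Z = Xc @ Vt.T                                    # all principal component scores

# order components by |correlation with y| rather than by eigenvalue
corr = np.array([abs(np.corrcoef(Z[:, j], yc)[0, 1]) for j in range(p)])
order = np.argsort(corr)[::-1]

def cv_error(k, folds=5):
    """k-fold CV error using the k components most correlated with y."""
    idx = np.arange(n) % folds
    err = 0.0
    for f in range(folds):
        tr, te = idx != f, idx == f
        Zk = Z[:, order[:k]]
        b = np.linalg.lstsq(Zk[tr], yc[tr], rcond=None)[0]
        err += np.sum((yc[te] - Zk[te] @ b) ** 2)
    return err

best_k = min(range(1, p + 1), key=cv_error)
print("chosen number of components:", best_k)
```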


AN EFFICIENT ALGORITHM FOR SLIDING WINDOW BASED INCREMENTAL PRINCIPAL COMPONENTS ANALYSIS

  • Lee, Geunseop
    • Journal of the Korean Mathematical Society
    • /
    • v.57 no.2
    • /
    • pp.401-414
    • /
    • 2020
  • It is computationally expensive to compute principal components from scratch at every update or downdate when new data arrive and existing data are truncated from the data matrix frequently. To overcome this limitation, incremental principal component analysis is considered. Specifically, we present a sliding window based efficient incremental principal component computation from a covariance matrix, which comprises two procedures: a simultaneous update and downdate of the principal components, followed by a rank-one matrix update. Additionally, we track the accurate decomposition error and the adaptive numerical rank. Experiments show that the proposed algorithm achieves a faster execution speed with no meaningful difference in decomposition error compared to typical incremental principal component analysis algorithms, thereby maintaining a good approximation of the principal components.
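The simultaneous update and downdate of a windowed covariance amounts to a pair of rank-one modifications. A minimal sketch follows; note that the paper updates the eigendecomposition itself, whereas this illustration merely maintains the matrix incrementally and re-decomposes it:

```python
import numpy as np

rng = np.random.default_rng(6)
d, window = 5, 50
stream = rng.normal(size=(200, d))

# initialize the second-moment matrix on the first window
buf = list(stream[:window])
S = sum(np.outer(x, x) for x in buf)

for x_new in stream[window:]:
    x_old = buf.pop(0)                  # sample truncated from the window
    buf.append(x_new)                   # sample entering the window
    # simultaneous rank-one update (new sample) and downdate (old sample)
    S += np.outer(x_new, x_new) - np.outer(x_old, x_old)

# principal components of the current window from the maintained matrix
eigval, eigvec = np.linalg.eigh(S / window)
pcs = eigvec[:, np.argsort(eigval)[::-1]]

# sanity check against computing from scratch
S_direct = np.array(buf).T @ np.array(buf)
print("max deviation from direct computation:", np.abs(S - S_direct).max())
```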

Simple principal component analysis using Lasso (라소를 이용한 간편한 주성분분석)

  • Park, Cheolyong
    • Journal of the Korean Data and Information Science Society
    • /
    • v.24 no.3
    • /
    • pp.533-541
    • /
    • 2013
  • In this study, a simple principal component analysis using the Lasso is proposed. The method consists of two steps. The first step computes the principal components by ordinary principal component analysis. The second step regresses each principal component on the original data matrix by the Lasso regression method. Each new principal component is then computed as a linear combination of the original data matrix, using the scaled Lasso regression coefficient estimates as the coefficients of the combination. This method leads to easily interpretable principal components with more zero coefficients, by the properties of Lasso regression models, because the least-squares estimator of the regression of each principal component on the original data matrix is the corresponding eigenvector. The method is applied to real and simulated data sets with the help of an R package for Lasso regression, and its usefulness is demonstrated.
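The two steps can be sketched with scikit-learn's Lasso in place of the R package (the penalty alpha is an illustrative choice and the unit-length rescaling is my assumption):

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import Lasso

rng = np.random.default_rng(7)
n, p = 100, 6
X = rng.normal(size=(n, p))
X[:, 1] += X[:, 0]                       # give the data some structure
Xc = X - X.mean(0)

# Step 1: ordinary principal components
Z = PCA().fit_transform(Xc)

# Step 2: regress each component on the data; Lasso zeroes small loadings
sparse_loadings = np.column_stack([
    Lasso(alpha=0.1).fit(Xc, Z[:, j]).coef_ for j in range(p)
])
# rescale each column to unit length so it acts like a loading vector
norms = np.linalg.norm(sparse_loadings, axis=0)
norms[norms == 0] = 1.0
sparse_loadings = sparse_loadings / norms

new_pcs = Xc @ sparse_loadings           # simplified principal components
print("zero loadings:", int((sparse_loadings == 0).sum()))
```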

Risk Evaluation of Slope Using Principal Component Analysis (PCA) (주성분분석을 이용한 사면의 위험성 평가)

  • Jung, Soo-Jung;Kim, Yong-Soo;Kim, Tae-Hyung
    • Journal of the Korean Geotechnical Society
    • /
    • v.26 no.10
    • /
    • pp.69-79
    • /
    • 2010
  • To detect abnormal events in slopes, principal component analysis (PCA) is applied to a slope that collapsed during monitoring. Principal component analysis is a statistical method sometimes described as non-parametric modeling. In this analysis, the principal component score indicates abnormal behavior of the slope: in an abnormal event, the score is markedly higher or lower than in a normal situation, so a large change in the score signals an abnormality. The results confirm that the abnormal events and the collapse of the slope were detected using principal component analysis, suggesting that slope behavior and abnormal events can be predicted quantitatively with this approach.
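The score-monitoring idea can be sketched as follows (synthetic monitoring data; the 3-sigma control limit and the sensor layout are my assumptions, not the paper's criteria):

```python
import numpy as np

rng = np.random.default_rng(8)
n_normal, d = 200, 4                     # monitoring records from 4 sensors
normal = rng.normal(size=(n_normal, d))
mu, sd = normal.mean(0), normal.std(0)
Z = (normal - mu) / sd

# principal components of the normal operating period
eigval, eigvec = np.linalg.eigh(np.cov(Z, rowvar=False))
V = eigvec[:, np.argsort(eigval)[::-1]]
scores = Z @ V
limits = 3 * scores.std(0)               # 3-sigma control limit per score

# a new reading far outside the normal operating pattern
x_new = mu + 8 * sd
s_new = ((x_new - mu) / sd) @ V
flag = np.any(np.abs(s_new) > limits)    # score change flags the abnormal event
print("abnormal event detected:", bool(flag))
```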