• Title/Abstract/Keywords: sliced inverse regression

Search results: 22

Classification Using Sliced Inverse Regression and Sliced Average Variance Estimation

  • Lee, Hakbae
    • Communications for Statistical Applications and Methods / Vol. 11, No. 2 / pp. 275-285 / 2004
  • We explore classification analysis using graphical methods based on dimension reduction, namely sliced inverse regression and sliced average variance estimation. Both methods provide useful information for classification analysis through dimension reduction. Two examples are illustrated, and the classification rates obtained by sliced inverse regression and sliced average variance estimation are compared with those of discriminant analysis and logistic regression.
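
Below is a minimal NumPy sketch of the two kernel matrices the abstract builds on: the SIR kernel Cov(E[Z | slice]) and the SAVE kernel E[(I - Cov(Z | slice))^2] on standardized predictors. The function names, quantile-based slicing, and defaults are illustrative assumptions, not the paper's code.

```python
# A minimal sketch of the SIR and SAVE kernel matrices (illustrative,
# not the paper's code). Slicing by the order of y is assumed.
import numpy as np

def slice_indices(y, n_slices):
    """Split observation indices into n_slices groups by the order of y."""
    return np.array_split(np.argsort(y), n_slices)

def sir_save_directions(X, y, n_slices=5, d=2):
    """Return d SIR and d SAVE directions on the original predictor scale."""
    n, p = X.shape
    cov = np.cov(X, rowvar=False)
    evals, evecs = np.linalg.eigh(cov)
    inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T
    Z = (X - X.mean(axis=0)) @ inv_sqrt      # standardized predictors

    M_sir = np.zeros((p, p))                 # Cov(E[Z | slice])
    M_save = np.zeros((p, p))                # E[(I - Cov(Z | slice))^2]
    for idx in slice_indices(y, n_slices):
        f = len(idx) / n
        zbar = Z[idx].mean(axis=0)
        M_sir += f * np.outer(zbar, zbar)
        R = np.eye(p) - np.cov(Z[idx], rowvar=False)
        M_save += f * R @ R

    # Leading eigenvectors, mapped back to the X scale by Sigma^{-1/2}
    b_sir = inv_sqrt @ np.linalg.eigh(M_sir)[1][:, ::-1][:, :d]
    b_save = inv_sqrt @ np.linalg.eigh(M_save)[1][:, ::-1][:, :d]
    return b_sir, b_save
```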

Iterative projection of sliced inverse regression with fused approach

  • Han, Hyoseon;Cho, Youyoung;Yoo, Jae Keun
    • Communications for Statistical Applications and Methods / Vol. 28, No. 2 / pp. 205-215 / 2021
  • Sufficient dimension reduction is a useful dimension reduction tool in regression, and sliced inverse regression (Li, 1991) is one of the most popular sufficient dimension reduction methodologies. In spite of its popularity, it is known to be sensitive to the number of slices. To overcome this shortcoming, the so-called fused sliced inverse regression was proposed by Cook and Zhang (2014). Unfortunately, neither of the two existing methods applies directly to large p, small n regression, where dimension reduction is needed most. In this paper, we propose seeded sliced inverse regression and seeded fused sliced inverse regression to overcome this deficit by adopting an iterative projection approach (Cook et al., 2007). Numerical studies examine their asymptotic estimation behaviors, and real data analysis confirms their practical usefulness in high-dimensional data analysis.
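
For intuition, here is a minimal sketch of the seeded, iterative-projection idea in the style of Cook et al. (2007): a seed matrix built from slice means is repeatedly multiplied by the predictor covariance to grow a Krylov-type basis, onto which the regression is reduced. All names and defaults are assumptions, not the authors' implementation.

```python
# A minimal sketch of seeded SIR via iterative projection (illustrative).
import numpy as np

def sir_seed(X, y, n_slices=5):
    """Seed matrix: weighted slice means of centered X (p x n_slices)."""
    n = X.shape[0]
    Xc = X - X.mean(axis=0)
    cols = [np.sqrt(len(idx) / n) * Xc[idx].mean(axis=0)
            for idx in np.array_split(np.argsort(y), n_slices)]
    return np.column_stack(cols)

def seeded_basis(Sigma, seed, n_iter=3):
    """Krylov-type basis [seed, Sigma @ seed, Sigma^2 @ seed, ...]."""
    blocks, B = [], seed
    for _ in range(n_iter):
        blocks.append(B)
        B = Sigma @ B
    Q, _ = np.linalg.qr(np.column_stack(blocks))
    return Q    # orthonormal, p x (n_slices * n_iter)
```

SIR, or fused SIR, is then run on the reduced predictors X @ Q, which remains feasible when p exceeds n.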

Fused sliced inverse regression in survival analysis

  • Yoo, Jae Keun
    • Communications for Statistical Applications and Methods / Vol. 24, No. 5 / pp. 533-541 / 2017
  • Sufficient dimension reduction (SDR) replaces the original p-dimensional predictors with a lower-dimensional, linearly transformed predictor. Sliced inverse regression (SIR) has the longest history and the widest popularity among SDR methodologies. The critical weakness of SIR is its known sensitivity to the number of slices. Recently, fused sliced inverse regression was developed to overcome this deficit; it combines SIR kernel matrices constructed from various choices of the number of slices. In this paper, fused sliced inverse regression and SIR are compared to show that the former has a practical advantage over the latter in survival regression. Numerical studies confirm this, and a real data example is presented.
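
A minimal sketch of the fusing idea follows: compute the SIR kernel for several slice numbers and combine them. The paper combines the kernel matrices across a grid of slice choices; summing the positive semi-definite kernels, as below, spans the same combined column space. The survival aspect (censoring) is ignored here, and all details are assumptions.

```python
# A minimal sketch of fused SIR: combine SIR kernels over several
# slice numbers (illustrative, not the paper's code).
import numpy as np

def sir_kernel(Z, y, h):
    """SIR kernel Cov(E[Z | slice]) for h slices of standardized Z."""
    n, p = Z.shape
    M = np.zeros((p, p))
    for idx in np.array_split(np.argsort(y), h):
        zbar = Z[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(zbar, zbar)
    return M

def fused_sir_kernel(Z, y, slice_grid=(3, 5, 7, 10)):
    """Fuse kernels over a grid of slice numbers."""
    return sum(sir_kernel(Z, y, h) for h in slice_grid)
```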

On robustness in dimension determination in fused sliced inverse regression

  • Yoo, Jae Keun;Cho, Yoo Na
    • Communications for Statistical Applications and Methods / Vol. 25, No. 5 / pp. 513-521 / 2018
  • The goal of sufficient dimension reduction (SDR) is to replace the original p-dimensional predictors with a lower-dimensional, linearly transformed predictor. Sliced inverse regression (SIR) (Li, Journal of the American Statistical Association, 86, 316-342, 1991) is one of the most popular SDR methods because of its applicability and simple implementation in practice. However, SIR may yield different dimension reduction results for different numbers of slices, which, despite its popularity, is a clear deficit of SIR. To overcome this, fused sliced inverse regression was recently proposed. That study shows that the dimension-reduced predictors are robust to the number of slices, but it does not investigate how robust the dimension determination is. This paper suggests a permutation dimension determination for fused sliced inverse regression and compares it with SIR to investigate robustness to the number of slices in the dimension determination. Numerical studies confirm the robustness, and a real data example is presented.
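
The abstract does not spell out the test, but a generic permutation test for the structural dimension can be sketched as follows: for a hypothesized dimension m, permute the coordinates orthogonal to the leading m estimated directions and compare the residual eigenvalue mass with its permutation distribution. Everything below is an illustrative assumption, not the paper's procedure.

```python
# A generic permutation test for the structural dimension (illustrative).
import numpy as np

def perm_dim_test(Z, y, kernel_fn, m, n_perm=200, seed=1):
    """P-value for H0: d = m, permuting coordinates beyond the first m."""
    rng = np.random.default_rng(seed)
    p = Z.shape[1]
    w, V = np.linalg.eigh(kernel_fn(Z, y))   # ascending eigenvalues
    stat = w[: p - m].sum()                  # eigenvalue mass beyond m dirs
    Zr = Z @ V[:, ::-1]                      # leading directions first
    exceed = 0
    for _ in range(n_perm):
        Zp = Zr.copy()
        Zp[:, m:] = Zp[rng.permutation(len(y)), m:]   # break link beyond m
        wp = np.linalg.eigh(kernel_fn(Zp, y))[0]
        exceed += wp[: p - m].sum() >= stat
    return exceed / n_perm

# kernel_fn can be an ordinary or fused SIR kernel; the estimated
# dimension is the smallest m with a non-significant p-value.
```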

A study on the multivariate sliced inverse regression

  • 이용구;이덕기
    • The Korean Journal of Applied Statistics / Vol. 10, No. 2 / pp. 293-308 / 1997
  • Univariate sliced inverse regression is a method for estimating the effective dimension-reduction directions and subspace in a generalized regression model. In this paper, we propose bivariate sliced inverse regression, which estimates the effective dimension-reduction directions and subspace by considering two generalized regression models simultaneously. The bivariate method estimates effective dimension-reduction directions for various model forms, including linear, quadratic, cubic, and nonlinear models, and, compared with univariate sliced inverse regression, it is far less affected by the error present in the model. In particular, it resolves the problem that univariate sliced inverse regression cannot estimate the effective dimension-reduction direction when the model is a symmetric quadratic.
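
One natural reading of the bivariate proposal is to slice jointly on both responses; a minimal sketch, assuming a two-dimensional quantile grid of slices, follows. This is an illustration, not the authors' algorithm.

```python
# A minimal sketch of a bivariate SIR kernel: slice jointly on (y1, y2)
# over an h1 x h2 quantile grid (illustrative).
import numpy as np

def bivariate_sir_kernel(Z, y1, y2, h1=4, h2=4):
    """SIR kernel using an h1 x h2 grid of slices over (y1, y2)."""
    n, p = Z.shape
    cut1 = np.quantile(y1, np.linspace(0, 1, h1 + 1)[1:-1])
    cut2 = np.quantile(y2, np.linspace(0, 1, h2 + 1)[1:-1])
    b1, b2 = np.searchsorted(cut1, y1), np.searchsorted(cut2, y2)
    M = np.zeros((p, p))
    for s in np.unique(b1 * h2 + b2):        # occupied grid cells only
        idx = np.flatnonzero(b1 * h2 + b2 == s)
        zbar = Z[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(zbar, zbar)
    return M
```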

Variable Selection in Sliced Inverse Regression Using Generalized Eigenvalue Problem with Penalties

  • Park, Chong-Sun
    • Communications for Statistical Applications and Methods / Vol. 14, No. 1 / pp. 215-227 / 2007
  • A variable selection algorithm for sliced inverse regression (SIR) using penalty functions is proposed. We note that SIR models can be expressed as generalized eigenvalue decompositions and incorporate penalty functions into them. A small simulation study suggests that the HARD penalty function is the best at preserving the original directions compared with other well-known penalty functions; it also turned out to be effective in forcing coefficient estimates to zero for irrelevant predictors in regression analysis. Results from illustrative examples on simulated and real data sets are provided.
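
A crude sketch of the setup follows: SIR directions solve the generalized eigenproblem M b = lambda * Sigma b, and the HARD penalty is mimicked here by thresholding small loadings. The paper's penalized criterion is more refined; the threshold tau and all names are assumptions.

```python
# SIR as a generalized eigenproblem with hard-thresholded loadings
# (a crude illustrative stand-in for the penalized criterion).
import numpy as np
from scipy.linalg import eigh

def sir_kernel_x(X, y, h=5):
    """SIR kernel Cov(E[X | slice]) on the original predictor scale."""
    n, p = X.shape
    Xc = X - X.mean(axis=0)
    M = np.zeros((p, p))
    for idx in np.array_split(np.argsort(y), h):
        xbar = Xc[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(xbar, xbar)
    return M

def hard_threshold_sir(X, y, d=1, h=5, tau=0.1):
    M, Sigma = sir_kernel_x(X, y, h), np.cov(X, rowvar=False)
    _, B = eigh(M, Sigma)            # generalized eigenvectors, ascending
    B = B[:, ::-1][:, :d].copy()     # leading d directions
    B[np.abs(B) < tau] = 0.0         # zero out loadings of irrelevant Xs
    return B
```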

A selective review of nonlinear sufficient dimension reduction

  • Sehun Jang;Jun Song
    • Communications for Statistical Applications and Methods / Vol. 31, No. 2 / pp. 247-262 / 2024
  • In this paper, we explore nonlinear sufficient dimension reduction (SDR) methods, with a primary focus on establishing a foundational framework that integrates various nonlinear SDR methods. We illustrate generalized sliced inverse regression (GSIR) and generalized sliced average variance estimation (GSAVE), which fit within this framework. Further, we delve into nonlinear extensions of inverse moments through the kernel trick, specifically examining kernel sliced inverse regression (KSIR) and kernel canonical correlation analysis (KCCA), and explore their relationships within the established framework. We also briefly explain nonlinear SDR for functional data. In addition, we present practical aspects such as algorithmic implementations. The paper concludes with remarks on the dimensionality problem of the target function class.
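
A common implementation route for the kernel trick mentioned here is to map the predictors to kernel principal component scores and then run ordinary SIR on those scores. A minimal sketch under a Gaussian kernel follows; gamma, n_comp, and the function names are assumptions, not the review's notation.

```python
# Kernel-PCA feature scores as the first step of a KSIR-type method
# (illustrative; a Gaussian kernel is assumed).
import numpy as np

def kpca_scores(X, n_comp=10, gamma=1.0):
    """Leading kernel-PCA scores under a Gaussian kernel."""
    n = X.shape[0]
    sq = (X ** 2).sum(1)[:, None] + (X ** 2).sum(1)[None, :] - 2 * X @ X.T
    K = np.exp(-gamma * sq)
    J = np.eye(n) - np.ones((n, n)) / n
    w, V = np.linalg.eigh(J @ K @ J)         # double-centered Gram matrix
    w, V = w[::-1][:n_comp], V[:, ::-1][:, :n_comp]
    return V * np.sqrt(np.maximum(w, 0.0))   # n x n_comp feature scores

# Running SIR on kpca_scores(X) against y yields KSIR-type nonlinear
# sufficient predictors.
```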

Asymptotic Test for Dimensionality in Sliced Inverse Regression

  • 박종선;곽재근
    • The Korean Journal of Applied Statistics / Vol. 18, No. 2 / pp. 381-393 / 2005
  • Sliced inverse regression is one method for exploring the linear combinations of explanatory variables needed in a regression model. Several tests have been introduced for determining the number of such linear combinations, that is, the dimension, in the sliced inverse regression model, but they require the normality assumption on the explanatory variables or carry other restrictions. In this paper, using a probabilistic model for principal component analysis, we introduce a test that does not require the normality assumption and is robust to the number of slices, and we compare it with existing tests through simulations and applications to real data.
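
For contrast, here is a sketch of the classical normality-based baseline the abstract refers to: the sequential chi-squared test of Li (1991) on standardized predictors Z. The proposed probabilistic-PCA test itself is not reproduced here.

```python
# Li's (1991) sequential chi-squared test for the SIR dimension
# (the normality-based baseline; illustrative defaults).
import numpy as np
from scipy.stats import chi2

def li_dimension_pvalues(Z, y, h=5):
    """P-values for H0: d = m, m = 0, 1, ...; stop at first non-rejection."""
    n, p = Z.shape
    M = np.zeros((p, p))
    for idx in np.array_split(np.argsort(y), h):
        zbar = Z[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(zbar, zbar)
    w = np.linalg.eigh(M)[0][::-1]           # descending eigenvalues
    pvals = []
    for m in range(min(p, h - 1)):
        stat = n * w[m:].sum()               # smallest p - m eigenvalues
        pvals.append(chi2.sf(stat, (p - m) * (h - m - 1)))
    return pvals
```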

Model-based inverse regression for mixture data

  • Choi, Changhwan;Park, Chongsun
    • Communications for Statistical Applications and Methods / Vol. 24, No. 1 / pp. 97-113 / 2017
  • This paper proposes a method for sufficient dimension reduction (SDR) of mixture data. We consider mixture data containing more than one component, each with a distinct central subspace. We adopt the approach of model-based sliced inverse regression (MSIR) for mixture data in a simple and intuitive manner, employing mixture probabilistic principal component analysis (MPPCA) to estimate each central subspace and cluster the data points. Results from simulation studies and a real data set show that our method recovers the appropriate central subspaces satisfactorily and is robust to the number of slices chosen. Discussions of root selection, estimation accuracy, and classification, along with initial-value issues of MPPCA and related simulation results, are also provided.
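
A minimal stand-in for the model-based procedure: cluster with a Gaussian mixture (sklearn's GaussianMixture here, rather than MPPCA) and run SIR within each component. Per-component standardization is omitted for brevity; every choice below is an assumption, not the paper's algorithm.

```python
# Cluster-then-SIR as an illustrative stand-in for MSIR on mixture data.
import numpy as np
from sklearn.mixture import GaussianMixture

def mixture_sir(X, y, n_components=2, h=5, d=1):
    labels = GaussianMixture(n_components, random_state=0).fit_predict(X)
    bases = []
    for g in range(n_components):
        Xg, yg = X[labels == g], y[labels == g]
        n, p = Xg.shape
        Xc = Xg - Xg.mean(axis=0)
        M = np.zeros((p, p))
        for idx in np.array_split(np.argsort(yg), h):
            xbar = Xc[idx].mean(axis=0)
            M += (len(idx) / n) * np.outer(xbar, xbar)
        bases.append(np.linalg.eigh(M)[1][:, ::-1][:, :d])  # leading d dirs
    return labels, bases
```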

Tutorial: Methodologies for sufficient dimension reduction in regression

  • Yoo, Jae Keun
    • Communications for Statistical Applications and Methods / Vol. 23, No. 2 / pp. 105-117 / 2016
  • In this paper, as a sequel to the first tutorial, we discuss sufficient dimension reduction methodologies used to estimate the central subspace (sliced inverse regression, sliced average variance estimation), the central mean subspace (ordinary least squares, principal Hessian directions, iterative Hessian transformation), and the central kth-moment subspace (covariance method). Large-sample tests to determine the structural dimensions of the three target subspaces are well developed for most of the methodologies; in addition, a permutation test, which does not require large-sample distributions, is introduced and can be applied to all the methodologies discussed in the paper. Theoretical relationships among the sufficient dimension reduction methodologies are also investigated, and real data analysis is presented for illustration. A seeded dimension reduction approach is then introduced so that the methodologies apply to large p, small n regressions.
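
Of the central-mean-subspace methods listed, principal Hessian directions (pHd) admits a compact sketch; a minimal response-based pHd on standardized predictors follows (illustrative defaults, not the tutorial's code).

```python
# Response-based principal Hessian directions (pHd), illustrative sketch.
import numpy as np

def phd_directions(X, y, d=2):
    n, p = X.shape
    evals, evecs = np.linalg.eigh(np.cov(X, rowvar=False))
    inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T
    Z = (X - X.mean(axis=0)) @ inv_sqrt
    r = y - y.mean()
    Mhat = (Z * r[:, None]).T @ Z / n        # E[(y - E y) Z Z^T]
    w, V = np.linalg.eigh(Mhat)
    order = np.argsort(-np.abs(w))           # rank by |eigenvalue|
    return inv_sqrt @ V[:, order[:d]]        # back to the X scale
```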