• Title/Summary/Keyword: Regression Statistical Analysis

Search results: 3,392

Canonical Correlation: Permutation Tests and Regression

  • Yoo, Jae-Keun; Kim, Hee-Youn; Um, Hye-Yeon
    • Communications for Statistical Applications and Methods / Vol. 19, No. 3 / pp.471-478 / 2012
  • In this paper, we present a permutation test to select the number of pairs of canonical variates in canonical correlation analysis. The existing chi-squared test is known to be limited in use by its normality assumption. We compare the existing test with the proposed permutation test and study their asymptotic behaviors through numerical studies. In addition, we connect canonical correlation analysis to regression and show that certain inferences in regression can be done through canonical correlation analysis. A regression analysis of real data through canonical correlation analysis is illustrated.
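
To illustrate the idea described above, here is a minimal sketch of a permutation test for the first canonical correlation: the rows of Y are permuted to break the association with X, and the largest sample canonical correlation serves as the test statistic. The statistic, the SVD-based computation, and all names are illustrative assumptions, not the authors' exact sequential procedure.

```python
import numpy as np
from numpy.linalg import svd, inv
from scipy.linalg import sqrtm

def canonical_correlations(X, Y):
    """Sample canonical correlations via the SVD of the standardized cross-covariance."""
    Xc = X - X.mean(axis=0)
    Yc = Y - Y.mean(axis=0)
    n = len(X)
    Sxx, Syy, Sxy = Xc.T @ Xc / n, Yc.T @ Yc / n, Xc.T @ Yc / n
    M = inv(np.real(sqrtm(Sxx))) @ Sxy @ inv(np.real(sqrtm(Syy)))
    return np.clip(svd(M, compute_uv=False), 0.0, 1.0)

def permutation_test(X, Y, n_perm=999, seed=None):
    """Approximate p-value for H0: the first canonical correlation is zero."""
    rng = np.random.default_rng(seed)
    observed = canonical_correlations(X, Y)[0]
    null = [canonical_correlations(X, Y[rng.permutation(len(Y))])[0]
            for _ in range(n_perm)]
    return (1 + sum(s >= observed for s in null)) / (n_perm + 1)
```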

Regression Quantile Estimators of a Nonlinear Time Series Regression Model

  • 김태수; 허선; 김해경
    • Proceedings of the Korean Statistical Society Conference / 2000 Autumn Conference / pp.13-15 / 2000
  • In this paper, we deal with the asymptotic properties of regression quantile estimators in the nonlinear time series regression model. For the sinusoidal model, which frequently appears in time series analysis, we study the strong consistency and asymptotic normality of regression quantile estimators.
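
As a rough illustration of the estimator discussed above, the sketch below fits a sinusoidal model by minimizing the Koenker-Bassett check loss. The parameterization, the derivative-free optimizer, the starting frequency, and the simulated example are assumptions made for illustration only, not the paper's exact estimator.

```python
import numpy as np
from scipy.optimize import minimize

def check_loss(u, tau):
    """Koenker-Bassett check function rho_tau(u), summed over residuals."""
    return np.sum(u * (tau - (u < 0)))

def quantile_fit(t, y, tau=0.5, w0=1.0):
    """Estimate (a, b, w) in y_t = a*cos(w t) + b*sin(w t) + e_t by check-loss minimization."""
    def objective(theta):
        a, b, w = theta
        resid = y - (a * np.cos(w * t) + b * np.sin(w * t))
        return check_loss(resid, tau)
    # w0 is a rough starting value for the frequency (assumed roughly known)
    res = minimize(objective, x0=[0.0, 0.0, w0], method="Nelder-Mead")
    return res.x

# Usage: median regression (tau = 0.5) on simulated heavy-tailed data
t = np.arange(200)
rng = np.random.default_rng(0)
y = 2.0 * np.cos(0.3 * t) + 1.0 * np.sin(0.3 * t) + rng.standard_t(df=3, size=200)
print(quantile_fit(t, y, tau=0.5, w0=0.3))
```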


Multicollinearity in Logistic Regression

  • Jong-Han Lee; Myung-Hoe Huh
    • Communications for Statistical Applications and Methods / Vol. 2, No. 2 / pp.303-309 / 1995
  • Many measures to detect multicollinearity in linear regression have been proposed in the statistics and numerical analysis literature. Among them, the condition number and the variance inflation factor (VIF) are the most popular. In this study, we give new interpretations of the condition number and VIF in linear regression, using geometry on the explanatory space. In the same line, we derive natural measures of the condition number and VIF for logistic regression. These computer-intensive measures can be easily extended to evaluate multicollinearity in generalized linear models.
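
The two diagnostics named in the abstract are straightforward to compute for linear regression. The sketch below follows one common convention (each column scaled to unit length before taking the condition number); that scaling choice and all names are assumptions, separate from the geometric interpretations developed in the paper.

```python
import numpy as np

def vif(X):
    """VIF_j = 1 / (1 - R_j^2), where R_j^2 regresses column j on the remaining columns."""
    X = np.asarray(X, dtype=float)
    out = []
    for j in range(X.shape[1]):
        y = X[:, j]
        Z = np.column_stack([np.ones(len(X)), np.delete(X, j, axis=1)])
        beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
        resid = y - Z @ beta
        r2 = 1 - resid @ resid / np.sum((y - y.mean()) ** 2)
        out.append(1.0 / (1.0 - r2))
    return np.array(out)

def condition_number(X):
    """Ratio of the largest to smallest singular value of the column-scaled design matrix."""
    Xs = X / np.linalg.norm(X, axis=0)          # scale columns to unit length
    s = np.linalg.svd(Xs, compute_uv=False)
    return s.max() / s.min()
```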


On study for change point regression problems using a difference-based regression model

  • Park, Jong Suk; Park, Chun Gun; Lee, Kyeong Eun
    • Communications for Statistical Applications and Methods / Vol. 26, No. 6 / pp.539-556 / 2019
  • This paper derives a method for solving change point regression problems using properties of the difference-based intercept estimator first introduced by Park and Kim (Communications in Statistics - Theory and Methods, 2019) for outlier detection in multiple linear regression models. We describe the statistical properties of the difference-based regression model in a piecewise simple linear regression setting and then propose an efficient algorithm for change point detection. We illustrate the merits of the proposed method through comparison with several existing methods in simulation studies and a real data analysis. The methodology remains valuable regardless of the form of the regression lines and the number of change points.
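
For orientation only, here is a minimal baseline for a single change point in a piecewise simple linear regression: scan candidate break points and keep the split with the smallest total residual sum of squares. This is a generic exhaustive-search sketch, not the difference-based intercept estimator the paper develops, and all names are assumptions.

```python
import numpy as np

def fit_sse(x, y):
    """Residual sum of squares of a simple linear regression of y on x."""
    X = np.column_stack([np.ones(len(x)), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return resid @ resid

def single_change_point(x, y, min_seg=5):
    """Return the x-value that best splits the data into two separately fitted lines."""
    order = np.argsort(x)
    x, y = x[order], y[order]
    best_k, best_sse = None, np.inf
    for k in range(min_seg, len(x) - min_seg):
        sse = fit_sse(x[:k], y[:k]) + fit_sse(x[k:], y[k:])
        if sse < best_sse:
            best_k, best_sse = k, sse
    return x[best_k], best_sse
```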

Introduction of a Nonlinear Regression Analysis System NLIN2000

  • 강근석; 심규호
    • The Korean Journal of Applied Statistics / Vol. 17, No. 1 / pp.173-184 / 2004
  • We introduce NLIN2000, a statistical software package for nonlinear regression analysis that is easy to use in the Windows environment while providing a wide range of statistics. It is an upgrade of an existing DOS program; compared with other statistical packages, specifying and fitting a model equation is simpler, and it offers features such as saving and deleting model equations and viewing their functional forms. NLIN2000 not only provides the various statistics essential to statisticians studying the statistical theory of nonlinear regression, but can also be useful to researchers in other fields who analyze real data with nonlinear models.
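
NLIN2000 itself is a Windows package; as a language-neutral sketch of the kind of fit it automates, the example below runs a nonlinear least squares fit with SciPy. The exponential model, starting values, and simulated data are assumptions for illustration only.

```python
import numpy as np
from scipy.optimize import curve_fit

def model(x, a, b, c):
    """Three-parameter exponential growth curve (illustrative model only)."""
    return a * np.exp(b * x) + c

# Simulated data from the assumed model plus Gaussian noise
x = np.linspace(0, 5, 50)
rng = np.random.default_rng(1)
y = model(x, 2.0, 0.5, 1.0) + rng.normal(scale=0.3, size=x.size)

# Nonlinear least squares fit with user-supplied starting values
params, cov = curve_fit(model, x, y, p0=[1.0, 0.1, 0.0])
se = np.sqrt(np.diag(cov))                     # asymptotic standard errors
print("estimates:", params, "std. errors:", se)
```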

Applications of response dimension reduction in large p-small n problems

  • Minjee Kim; Jae Keun Yoo
    • Communications for Statistical Applications and Methods / Vol. 31, No. 2 / pp.191-202 / 2024
  • The goal of this paper is to show how multivariate regression analysis with high-dimensional responses is facilitated by response dimension reduction. Multivariate regression, characterized by multi-dimensional response variables, is increasingly prevalent across diverse fields such as repeated measures, longitudinal studies, and functional data analysis. One of the key challenges in analyzing such data is managing the response dimensions, which can complicate the analysis due to an exponential increase in the number of parameters. Although response dimension reduction methods have been developed, there is no practically useful illustration for various types of data, such as so-called large p-small n data. This paper aims to fill this gap by showcasing how response dimension reduction can enhance the analysis of high-dimensional response data, thereby providing significant assistance to statistical practitioners and contributing to advancements in multiple scientific domains.
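
As a rough sketch of the workflow described above, the snippet below replaces a high-dimensional response matrix (p much larger than n) with a few linear combinations before regressing on the predictors. Principal components serve only as a generic stand-in here; the paper's response dimension reduction estimator is model-based and is not reproduced.

```python
import numpy as np

def reduce_response(Y, d):
    """Project the centered responses onto their first d principal directions."""
    Yc = Y - Y.mean(axis=0)
    # SVD of the n x p response matrix; works even when p >> n
    U, s, Vt = np.linalg.svd(Yc, full_matrices=False)
    basis = Vt[:d].T                 # p x d reduction basis
    return Yc @ basis, basis

def fit_reduced_regression(X, Y, d):
    """OLS of the d reduced responses on the predictors X (with intercept)."""
    Z, basis = reduce_response(Y, d)
    Xd = np.column_stack([np.ones(len(X)), X])
    coef, *_ = np.linalg.lstsq(Xd, Z, rcond=None)
    return coef, basis
```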

Sensitivity Analysis in Latent Root Regression

  • Shin, Jae-Kyoung; Tomoyuki Tarumi; Yutaka Tanaka
    • Communications for Statistical Applications and Methods / Vol. 1, No. 1 / pp.102-111 / 1994
  • We propose a method of sensitivity analysis in latent root regression analysis (LRRA). For this purpose we derive the quantities $\hat{\beta}_{LRR}^{(1)}$, which correspond to the theoretical influence function $I(x, y\,;\,\hat{\beta}_{LRR})$ for the regression coefficient $\hat{\beta}_{LRR}$ based on LRRA. We give a numerical example for illustration and also investigate numerically the relationship between the estimated values of $\hat{\beta}_{LRR}^{(1)}$ and the values of another measure, the sample influence curve (SIC), obtained by recomputation with a single observation deleted. We also discuss the comparison among the results of LRRA, ordinary least squares regression analysis (OLSRA), and ridge regression analysis (RRA).
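
The sample influence curve mentioned in the abstract can be sketched by refitting the model with each observation deleted in turn. In the snippet below, ordinary least squares is only a placeholder fitting routine; the latent root regression fit would be substituted in practice, and all names are assumptions.

```python
import numpy as np

def ols_coef(X, y):
    """Placeholder fitting routine: OLS coefficients with an intercept."""
    Xd = np.column_stack([np.ones(len(X)), X])
    beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)
    return beta

def sample_influence_curve(X, y, fit=ols_coef):
    """SIC_i = (n - 1) * (beta_hat - beta_hat_(i)) for each observation i."""
    n = len(y)
    beta_full = fit(X, y)
    sic = np.empty((n, beta_full.size))
    for i in range(n):
        keep = np.arange(n) != i
        sic[i] = (n - 1) * (beta_full - fit(X[keep], y[keep]))
    return sic
```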


Hybrid Fuzzy Least Squares Support Vector Machine Regression for Crisp Input and Fuzzy Output

  • Shim, Joo-Yong; Seok, Kyung-Ha; Hwang, Chang-Ha
    • Communications for Statistical Applications and Methods / Vol. 17, No. 2 / pp.141-151 / 2010
  • Hybrid fuzzy regression analysis is used to integrate randomness and fuzziness into a regression model. The least squares support vector machine (LS-SVM) has been very successful in pattern recognition and function estimation problems for crisp data. This paper proposes a new method to evaluate hybrid fuzzy linear and nonlinear regression models with crisp inputs and fuzzy output using weighted fuzzy arithmetic (WFA) and LS-SVM. LS-SVM allows us to perform fuzzy nonlinear regression analysis by constructing a fuzzy linear regression function in a high-dimensional feature space. The proposed method is not computationally expensive since its solution is obtained from a simple linear equation system. In particular, this method is a very attractive approach to modeling nonlinear data, and it is nonparametric in the sense that we do not have to assume the underlying model function for the fuzzy nonlinear regression model with crisp inputs and fuzzy output. Experimental results are presented to indicate the performance of this method.
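
For reference, here is a minimal sketch of plain LS-SVM regression with a Gaussian kernel, i.e., the crisp building block that the paper combines with weighted fuzzy arithmetic. The fuzzy-output handling is omitted, and the kernel and regularization settings are arbitrary assumptions.

```python
import numpy as np

def rbf_kernel(A, B, sigma=1.0):
    """Gaussian (RBF) kernel matrix between the rows of A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def lssvm_fit(X, y, gamma=10.0, sigma=1.0):
    """Solve the LS-SVM linear system for the dual weights alpha and bias b."""
    n = len(y)
    K = rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    rhs = np.concatenate([[0.0], y])
    sol = np.linalg.solve(A, rhs)
    return sol[1:], sol[0]          # (alpha, b)

def lssvm_predict(Xnew, X, alpha, b, sigma=1.0):
    """Evaluate the fitted regression function at new inputs."""
    return rbf_kernel(Xnew, X, sigma) @ alpha + b
```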

Iterative projection of sliced inverse regression with fused approach

  • Han, Hyoseon; Cho, Youyoung; Yoo, Jae Keun
    • Communications for Statistical Applications and Methods / Vol. 28, No. 2 / pp.205-215 / 2021
  • Sufficient dimension reduction is a useful dimension reduction tool in regression, and sliced inverse regression (Li, 1991) is one of the most popular sufficient dimension reduction methodologies. In spite of its popularity, it is known to be sensitive to the number of slices. To overcome this shortcoming, the so-called fused sliced inverse regression was proposed by Cook and Zhang (2014). Unfortunately, the two existing methods do not apply directly to large p-small n regression, in which dimension reduction is desperately needed. In this paper, we newly propose seeded sliced inverse regression and seeded fused sliced inverse regression to overcome this deficit by adopting the iterative projection approach (Cook et al., 2007). Numerical studies are presented to study their asymptotic estimation behaviors, and real data analysis confirms their practical usefulness in high-dimensional data analysis.
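
A minimal sketch of basic sliced inverse regression (Li, 1991) for a univariate response is given below. The seeded and fused variants proposed in the paper are not implemented here, and the equal-size slicing scheme is an assumption.

```python
import numpy as np
from scipy.linalg import sqrtm, inv

def sir_directions(X, y, n_slices=5, d=1):
    """Return d estimated sufficient dimension reduction directions via basic SIR."""
    n, p = X.shape
    mu, Sigma = X.mean(axis=0), np.cov(X, rowvar=False)
    Sigma_isqrt = np.real(inv(sqrtm(Sigma)))
    Z = (X - mu) @ Sigma_isqrt                     # standardized predictors
    # slice on the ranks of y so slices have (roughly) equal size
    ranks = np.argsort(np.argsort(y))
    slice_id = np.floor(n_slices * ranks / n).astype(int)
    M = np.zeros((p, p))
    for h in range(n_slices):
        idx = slice_id == h
        if idx.sum() == 0:
            continue
        m_h = Z[idx].mean(axis=0)
        M += (idx.sum() / n) * np.outer(m_h, m_h)  # weighted slice-mean kernel
    vals, vecs = np.linalg.eigh(M)
    eta = vecs[:, ::-1][:, :d]                     # top-d eigenvectors
    return Sigma_isqrt @ eta                       # back to the original scale
```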

Robust Regression and Stratified Residuals for Left-Truncated and Right-Censored Data

  • Kim, Chul-Ki
    • Journal of the Korean Statistical Society / Vol. 26, No. 3 / pp.333-354 / 1997
  • Computational algorithms to calculate M-estimators and rank estimators of regression parameters from left-truncated and right-censored data are developed herein. In the case of M-estimators, new statistical methods are also introduced to incorporate leverage assessments and concomitant scale estimation in the presence of left truncation and right censoring on the observed response. Furthermore, graphical methods to examine the residuals from these data are presented. Two real data sets are used for illustration.
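
As background only, the sketch below fits a Huber M-estimator for linear regression by iteratively reweighted least squares with a MAD scale estimate. It shows just the uncensored building block; the paper's adjustments for left truncation and right censoring, and its concomitant scale estimation, are not reproduced, and all names are assumptions.

```python
import numpy as np

def huber_weights(r, scale, c=1.345):
    """Huber weights: 1 for small standardized residuals, c/|u| beyond the cutoff."""
    u = np.abs(r) / max(scale, 1e-12)
    return np.where(u <= c, 1.0, c / u)

def huber_m_estimate(X, y, n_iter=50, tol=1e-8):
    """Huber M-estimator of regression coefficients via iteratively reweighted least squares."""
    Xd = np.column_stack([np.ones(len(X)), X])
    beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)            # OLS starting values
    for _ in range(n_iter):
        r = y - Xd @ beta
        scale = 1.4826 * np.median(np.abs(r - np.median(r)))  # MAD scale estimate
        w = huber_weights(r, scale)
        sw = np.sqrt(w)
        beta_new, *_ = np.linalg.lstsq(sw[:, None] * Xd, sw * y, rcond=None)
        if np.max(np.abs(beta_new - beta)) < tol:
            return beta_new
        beta = beta_new
    return beta
```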
