• Title/Abstract/Keywords: penalized regression

78 search results

Bayesian Confidence Intervals in Penalized Likelihood Regression

  • Kim Young-Ju
    • Communications for Statistical Applications and Methods
    • /
    • Vol. 13, No. 1
    • /
    • pp.141-150
    • /
    • 2006
  • Penalized likelihood regression for exponential families has been considered by Kim (2005) through smoothing parameter selection and asymptotically efficient low-dimensional approximations. We derive approximate Bayesian confidence intervals based on the Bayes model associated with the lower-dimensional approximations to provide interval estimates in penalized likelihood regression, and we conduct empirical studies to assess their properties.
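The Bayes-model view behind such intervals can be illustrated with a minimal ridge-regression sketch (a stand-in for illustration only, not the paper's low-dimensional construction): a quadratic penalty corresponds to a Gaussian prior on the coefficients, and approximate Bayesian confidence intervals come from the posterior covariance.

```python
import numpy as np

# Illustrative sketch (not the paper's construction): under the Bayes model
# associated with a quadratic penalty, the ridge estimate is the posterior
# mean for beta ~ N(0, (sigma2 / lam) I), and approximate Bayesian confidence
# intervals come from the posterior covariance sigma2 * (X'X + lam I)^{-1}.
def ridge_bayes_ci(X, y, lam, sigma2, z=1.96):
    p = X.shape[1]
    A = np.linalg.inv(X.T @ X + lam * np.eye(p))
    beta_hat = A @ X.T @ y                # posterior mean = ridge estimate
    se = np.sqrt(sigma2 * np.diag(A))     # posterior standard deviations
    return beta_hat - z * se, beta_hat + z * se

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
y = X @ np.array([1.0, 0.0, -2.0]) + rng.normal(size=100)
lo, hi = ridge_bayes_ci(X, y, lam=1.0, sigma2=1.0)
```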

A note on standardization in penalized regressions

  • Lee, Sangin
    • Journal of the Korean Data and Information Science Society
    • /
    • Vol. 26, No. 2
    • /
    • pp.505-516
    • /
    • 2015
  • We consider sparse high-dimensional linear regression models. Penalized regressions have been used as effective methods for variable selection and estimation in high-dimensional models. In penalized regressions, it is common practice to standardize the variables, fit the penalized model to the standardized variables, and then transform the estimated coefficients back to the scale of the original variables. However, this procedure produces a slightly different solution from that of the corresponding original penalized problem. In this paper, we investigate issues in the standardization of variables in penalized regressions and formulate a definition of the standardized penalized estimator. In addition, we compare the original penalized estimator with the standardized penalized estimator through simulation studies and real data analysis.
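The discrepancy the paper studies can be reproduced with a small numeric sketch (a hypothetical illustration using ridge regression, since it has a closed form): fitting on standardized variables and rescaling the coefficients back generally does not give the same answer as penalizing on the original scale.

```python
import numpy as np

# Hypothetical illustration of the standardization issue using ridge
# regression (a penalized regression with a closed-form solution).
rng = np.random.default_rng(0)
n, p = 50, 3
X = rng.normal(size=(n, p)) * np.array([1.0, 10.0, 0.1])  # unequal scales
y = X @ np.array([1.0, 0.2, 5.0]) + rng.normal(size=n)

def ridge(X, y, lam):
    return np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

lam = 1.0
Xc = X - X.mean(axis=0)          # center both fits so only scaling differs
yc = y - y.mean()

beta_original = ridge(Xc, yc, lam)            # penalty on the original scale
sd = X.std(axis=0)
beta_rescaled = ridge(Xc / sd, yc, lam) / sd  # standardize, fit, recover

# The penalty weighs coefficients differently under rescaling, so the
# two solutions generally differ.
gap = np.max(np.abs(beta_original - beta_rescaled))
```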

Penalized Likelihood Regression with Negative Binomial Data with Unknown Shape Parameter

  • Kim, Young-Ju
    • Communications for Statistical Applications and Methods
    • /
    • Vol. 14, No. 1
    • /
    • pp.23-32
    • /
    • 2007
  • We consider penalized likelihood regression with data from the negative binomial distribution with unknown shape parameter. Smoothing parameter selection and asymptotically efficient low-dimensional approximations are employed for negative binomial data, along with shape parameter estimation through several different algorithms.

Two-Stage Penalized Composite Quantile Regression with Grouped Variables

  • Bang, Sungwan;Jhun, Myoungshic
    • Communications for Statistical Applications and Methods
    • /
    • Vol. 20, No. 4
    • /
    • pp.259-270
    • /
    • 2013
  • This paper considers a penalized composite quantile regression (CQR) that performs variable selection in the linear model with grouped variables. An adaptive sup-norm penalized CQR (ASCQR) is proposed to select variables in a grouped manner; in addition, the consistency and oracle property of the resulting estimator are derived under some regularity conditions. To improve the efficiency of estimation and variable selection, this paper suggests the two-stage penalized CQR (TSCQR), which uses the ASCQR to select relevant groups in the first stage and the adaptive lasso penalized CQR to select important variables in the second stage. Simulation studies are conducted to illustrate the finite sample performance of the proposed methods.
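The composite part of CQR combines quantile check losses across several levels while sharing one slope vector. A minimal sketch of that unpenalized objective (the paper's adaptive sup-norm penalty and its optimization are not reproduced here):

```python
import numpy as np

# Sketch of the composite quantile regression (CQR) objective only; the
# ASCQR penalty and two-stage procedure from the paper are omitted.
def check_loss(r, tau):
    # quantile check loss: rho_tau(r) = r * (tau - 1{r < 0})
    return r * (tau - (r < 0))

def cqr_objective(beta, intercepts, X, y, taus):
    # shared slopes beta across quantile levels, one intercept per level
    total = 0.0
    for b0, tau in zip(intercepts, taus):
        total += np.sum(check_loss(y - b0 - X @ beta, tau))
    return total

taus = np.array([0.25, 0.5, 0.75])
X = np.array([[1.0], [2.0], [3.0]])
y = np.array([1.0, 2.0, 3.0])
# Perfect fit: every residual is zero, so the composite loss is zero.
val = cqr_objective(np.array([1.0]), np.zeros(3), X, y, taus)
```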

An Outlier Detection Method in Penalized Spline Regression Models (벌점 스플라인 회귀모형에서의 이상치 탐지방법)

  • 서한손;송지은;윤민
    • The Korean Journal of Applied Statistics
    • /
    • Vol. 26, No. 4
    • /
    • pp.687-696
    • /
    • 2013
  • Outlier detection is very important in data analysis because the presence of outliers can distort the results of model fitting. Outlier detection methods have been studied by many researchers. In this paper, we propose a procedure that detects outliers by applying the direct outlier detection method of Hadi and Simonoff (1993) to penalized spline regression models, and we compare its efficiency with spline regression models and robust penalized spline regression models through simulation studies and an application to real data.
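The setting can be sketched in a simplified form (this is NOT the Hadi and Simonoff forward search used in the paper): fit a penalized truncated-power spline and flag observations whose residuals are large on a robust scale.

```python
import numpy as np

# Simplified sketch only, not the paper's procedure: penalized spline fit
# followed by a robust residual rule for flagging outliers.
def pspline_fit(x, y, knots, lam):
    # design matrix [1, x, (x - k)_+ ...]; ridge penalty on knot terms only
    B = np.column_stack([np.ones_like(x), x] +
                        [np.clip(x - k, 0.0, None) for k in knots])
    P = np.eye(B.shape[1])
    P[0, 0] = P[1, 1] = 0.0          # leave the linear part unpenalized
    coef = np.linalg.solve(B.T @ B + lam * P, B.T @ y)
    return B @ coef

rng = np.random.default_rng(1)
x = np.sort(rng.uniform(0.0, 1.0, 60))
y = np.sin(2 * np.pi * x) + rng.normal(scale=0.1, size=60)
y[30] += 2.0                          # plant one outlier

fit = pspline_fit(x, y, np.linspace(0.1, 0.9, 9), lam=1.0)
resid = y - fit
mad = np.median(np.abs(resid - np.median(resid)))
flag = np.abs(resid) > 3.0 * mad / 0.6745   # robust 3-sigma rule via MAD
```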

Penalized Likelihood Regression: Fast Computation and Direct Cross-Validation

  • Kim, Young-Ju;Gu, Chong
    • The Korean Statistical Society: Conference Proceedings
    • /
    • Proceedings of the 2005 Spring Conference of the Korean Statistical Society
    • /
    • pp.215-219
    • /
    • 2005
  • We consider penalized likelihood regression with exponential family responses. Parallel to recent developments in Gaussian regression, fast computation through asymptotically efficient low-dimensional approximations is explored, yielding an algorithm that scales much better than the $O(n^3)$ algorithm for the exact solution. Customizations of the direct cross-validation strategy for smoothing parameter selection in various distribution families are also explored and evaluated.

Variable Selection Via Penalized Regression

  • Yoon, Young-Joo;Song, Moon-Sup
    • Communications for Statistical Applications and Methods
    • /
    • Vol. 12, No. 3
    • /
    • pp.615-624
    • /
    • 2005
  • In this paper, we review the variable-selection properties of LASSO and SCAD in penalized regression. To improve on the weakness of SCAD at high noise levels, we propose a new penalty function, called MSCAD, which relaxes the unbiasedness condition of SCAD. To compare MSCAD with LASSO and SCAD, comparative studies are performed on simulated datasets and on a real dataset. The performances of the penalized regression methods are compared in terms of relative model error and the estimates of the coefficients. The results of the experiments show that the performance of MSCAD falls between those of LASSO and SCAD, as expected.
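For reference alongside this entry, the two baseline penalties it compares against can be written out directly (this sketch shows LASSO and the standard SCAD penalty of Fan and Li (2001); MSCAD itself is defined in the paper and is not reproduced here):

```python
import numpy as np

def lasso_pen(theta, lam):
    # LASSO: lam * |theta|, linear everywhere (biased for large signals)
    return lam * np.abs(theta)

def scad_pen(theta, lam, a=3.7):
    # SCAD: linear up to lam, quadratic transition, then constant, which
    # removes the bias of LASSO for large coefficients.
    t = np.abs(theta)
    small = t <= lam
    mid = (t > lam) & (t <= a * lam)
    return np.where(small, lam * t,
           np.where(mid, -(t**2 - 2 * a * lam * t + lam**2) / (2 * (a - 1)),
                    (a + 1) * lam**2 / 2))

# SCAD agrees with LASSO below lam, then levels off at (a + 1) * lam^2 / 2.
flat = scad_pen(10.0, 1.0)
```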

Variable Selection via Penalized Regression

  • Yoon, Young-Joo;Song, Moon-Sup
    • The Korean Statistical Society: Conference Proceedings
    • /
    • Proceedings of the 2005 Spring Conference of the Korean Statistical Society
    • /
    • pp.7-12
    • /
    • 2005
  • In this paper, we review the variable-selection properties of LASSO and SCAD in penalized regression. To improve on the weakness of SCAD at high noise levels, we propose a new penalty function, called MSCAD, which relaxes the unbiasedness condition of SCAD. To compare MSCAD with LASSO and SCAD, comparative studies are performed on simulated datasets and on a real dataset. The performances of the penalized regression methods are compared in terms of relative model error and the estimates of the coefficients. The results of the experiments show that the performance of MSCAD falls between those of LASSO and SCAD, as expected.

Computation and Smoothing Parameter Selection in Penalized Likelihood Regression

  • Kim Young-Ju
    • Communications for Statistical Applications and Methods
    • /
    • Vol. 12, No. 3
    • /
    • pp.743-758
    • /
    • 2005
  • This paper considers penalized likelihood regression with data from an exponential family. The fast computation method applied to Gaussian data (Kim and Gu, 2004) is extended to non-Gaussian data through asymptotically efficient low-dimensional approximations, and a corresponding algorithm is proposed. Smoothing parameter selection is also explored for various exponential families, which extends the existing cross-validation method of Xiang and Wahba, previously evaluated only with Bernoulli data.
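As a generic illustration of smoothing parameter selection for a linear smoother (this uses the classical GCV criterion as a stand-in, not the direct cross-validation method of the papers above):

```python
import numpy as np

# Generic GCV sketch, not the authors' method: pick the smoothing parameter
# minimizing GCV(lam) = n * RSS(lam) / (n - tr(H_lam))^2, where H_lam is
# the hat matrix of the penalized (ridge-type) fit.
def gcv_select(X, y, lams):
    n = len(y)
    scores = []
    for lam in lams:
        H = X @ np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T)
        rss = np.sum((y - H @ y) ** 2)
        scores.append(n * rss / (n - np.trace(H)) ** 2)
    return lams[int(np.argmin(scores))], scores

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 5))
y = X @ np.array([2.0, 0.0, 0.0, 1.0, 0.0]) + rng.normal(size=60)
lams = [0.01, 0.1, 1.0, 10.0, 100.0]
best_lam, scores = gcv_select(X, y, lams)
```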

Detection of multiple change points using penalized least square methods: a comparative study between ℓ0 and ℓ1 penalty (벌점-최소제곱법을 이용한 다중 변화점 탐색)

  • 손원;임요한;유동현
    • The Korean Journal of Applied Statistics
    • /
    • Vol. 29, No. 6
    • /
    • pp.1147-1154
    • /
    • 2016
  • In this study, we use simulation to compare the ${\ell}_0$-penalized least squares method and the fused lasso regression (FLR) method, both of which have recently attracted much attention in connection with the detection of multiple change points. In the simulations, the FLR method showed a relatively high tendency to falsely declare non-change points as change points compared with the ${\ell}_0$-penalized least squares method, and the ${\ell}_0$-penalized least squares method performed better overall than the FLR method. In addition, through dynamic programming, the ${\ell}_0$-penalized least squares method can be computed as efficiently as the FLR method.
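The dynamic-programming idea behind ${\ell}_0$-penalized change point detection can be sketched with a plain $O(n^2)$ optimal-partitioning recursion (a minimal illustration; the paper's more efficient computation is not reproduced here): each segment is fit by its mean, and every new segment pays a fixed penalty.

```python
import numpy as np

# Minimal O(n^2) optimal-partitioning sketch of l0-penalized least squares:
# minimize sum of segment SSEs + beta * (number of segments).
def l0_changepoints(y, beta):
    n = len(y)
    csum = np.concatenate([[0.0], np.cumsum(y)])
    csum2 = np.concatenate([[0.0], np.cumsum(y ** 2)])

    def cost(i, j):  # SSE of segment y[i:j] around its mean (half-open)
        s, s2, m = csum[j] - csum[i], csum2[j] - csum2[i], j - i
        return s2 - s * s / m

    # F[t] = best objective for y[:t]; F[0] = -beta so the first segment
    # also pays exactly one beta.
    F = np.full(n + 1, np.inf)
    F[0] = -beta
    prev = np.zeros(n + 1, dtype=int)
    for t in range(1, n + 1):
        for s in range(t):
            c = F[s] + cost(s, t) + beta
            if c < F[t]:
                F[t], prev[t] = c, s

    cps, t = [], n          # backtrack the segment boundaries
    while t > 0:
        s = prev[t]
        if s > 0:
            cps.append(int(s))
        t = s
    return sorted(cps)

y = np.array([0.0] * 20 + [5.0] * 20)   # one clean mean shift at index 20
cps = l0_changepoints(y, beta=1.0)
```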