http://dx.doi.org/10.5351/CSAM.2015.22.1.041

Variable Selection with Nonconcave Penalty Function on Reduced-Rank Regression  

Jung, Sang Yong (Department of Statistics, Sungkyunkwan University)
Park, Chongsun (Department of Statistics, Sungkyunkwan University)
Publication Information
Communications for Statistical Applications and Methods, Vol. 22, No. 1, 2015, pp. 41-54
Abstract
In this article, we propose nonconcave penalties on a reduced-rank regression model to select variables and estimate coefficients simultaneously. We apply the HARD (hard thresholding) and SCAD (smoothly clipped absolute deviation) penalties, symmetric functions that are singular at the origin and bounded by a constant, which reduces estimation bias. In a simulation study and a real data analysis, the new method is compared with an existing variable selection method based on the $L_1$ penalty and shows competitive performance in prediction and variable selection. Instead of using only one type of penalty function, we apply two or three penalty functions simultaneously, combining their respective advantages to select relevant predictors and estimate coefficients, thereby improving the overall performance of model fitting.
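For concreteness, the following is a minimal sketch of the two univariate penalty functions named above, using their standard closed forms; the function names are illustrative, a = 3.7 is the commonly used SCAD tuning constant, and the article's reduced-rank (group) version of these penalties is not reproduced here.

    import numpy as np

    def scad_penalty(theta, lam, a=3.7):
        # SCAD penalty, applied elementwise to |t| = |theta|:
        #   lam*|t|                                  for |t| <= lam
        #   -(t^2 - 2*a*lam*|t| + lam^2)/(2*(a-1))   for lam < |t| <= a*lam
        #   (a+1)*lam^2/2                            for |t| > a*lam
        t = np.abs(np.asarray(theta, dtype=float))
        linear = lam * t
        quadratic = -(t**2 - 2.0 * a * lam * t + lam**2) / (2.0 * (a - 1.0))
        constant = (a + 1.0) * lam**2 / 2.0
        return np.where(t <= lam, linear, np.where(t <= a * lam, quadratic, constant))

    def hard_penalty(theta, lam):
        # HARD (hard thresholding) penalty: lam^2 - (|t| - lam)^2 * I(|t| < lam),
        # which equals lam^2 once |t| >= lam.
        t = np.abs(np.asarray(theta, dtype=float))
        return lam**2 - (t - lam)**2 * (t < lam)

Both penalties level off at a constant (lam^2 for HARD, (a+1)*lam^2/2 for SCAD once |t| > a*lam), which is what "bounded by a constant to reduce bias" refers to: sufficiently large coefficients receive no additional shrinkage.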
Keywords
Group penalty; multivariate linear model; nonconcave penalty; reduced-rank regression; variable selection