http://dx.doi.org/10.29220/CSAM.2017.24.6.673

Penalized rank regression estimator with the smoothly clipped absolute deviation function  

Park, Jong-Tae (Department of Data Information, Pyeongtaek University)
Jung, Kang-Mo (Department of Statistics and Computer Science, Kunsan National University)
Publication Information
Communications for Statistical Applications and Methods / v.24, no.6, 2017, pp. 673-683
Abstract
The least absolute shrinkage and selection operator (LASSO) has been a popular regression estimator with simultaneous variable selection. However, LASSO does not have the oracle property, and a robust version is needed in the case of heavy-tailed errors or serious outliers. We propose a robust penalized regression estimator that provides simultaneous variable selection and estimation. It is based on rank regression and a non-convex penalty function, the smoothly clipped absolute deviation (SCAD) function, which has the oracle property. The proposed method combines the robustness of rank regression with the oracle property of the SCAD penalty. We develop an efficient algorithm to compute the proposed estimator that obtains the SCAD estimate through the local linear approximation and selects the tuning parameter of the penalty function. Our estimate can be obtained by the least absolute deviation method. The optimal tuning parameter is chosen by the Bayesian information criterion and the cross validation method. Numerical simulations show that the proposed estimator is robust and effective for analyzing contaminated data.
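The key ingredient named in the abstract is the SCAD penalty of Fan and Li (2001): under the local linear approximation (LLA) of Zou and Li (2008), the penalty is replaced by a weighted L1 term with weights given by the SCAD derivative evaluated at a current estimate, which is what lets each iteration be solved as a (weighted) least absolute deviation problem. The sketch below implements only the standard SCAD derivative and the resulting LLA weights; it is an illustration of the published formula, not the authors' code, and the function names are our own.

```python
import numpy as np

def scad_deriv(t, lam, a=3.7):
    """Derivative p'_lam(t) of the SCAD penalty (Fan and Li, 2001) for t >= 0.

    p'_lam(t) = lam                          if t <= lam,
              = (a*lam - t)_+ / (a - 1)      if t > lam,
    with the conventional choice a = 3.7.
    """
    t = np.asarray(t, dtype=float)
    return lam * (t <= lam) + np.maximum(a * lam - t, 0.0) / (a - 1.0) * (t > lam)

def lla_weights(beta_current, lam, a=3.7):
    """LLA weights w_j = p'_lam(|beta_j|): each iteration then minimizes the
    rank-based dispersion plus sum_j w_j * |beta_j|, a weighted-L1 problem
    solvable by LAD methods."""
    return scad_deriv(np.abs(beta_current), lam, a)
```

Small coefficients (|beta_j| <= lam) receive the full LASSO-like weight lam, while large coefficients (|beta_j| >= a*lam) receive weight zero and are left unpenalized, which is the mechanism behind the oracle property mentioned in the abstract.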
Keywords
local linear approximation; rank regression; robust methods; smoothly clipped absolute deviation; variable selection;
References
1 Alfons A, Croux C, and Gelper S (2013). Sparse least trimmed squares regression for analyzing high-dimensional large data sets, The Annals of Applied Statistics, 7, 226-248.
2 Chen X, Wang J, and McKeown MJ (2010). Asymptotic analysis of robust LASSOs in the presence of noise with large variance, IEEE Transactions on Information Theory, 56, 5131-5149.
3 Fan J and Li R (2001). Variable selection via nonconcave penalized likelihood and its oracle properties, Journal of the American Statistical Association, 96, 1348-1360.
4 Hoerl AE and Kennard RW (1970). Ridge regression: biased estimation for nonorthogonal problems, Technometrics, 12, 55-67.
5 Jaeckel LA (1972). Estimating regression coefficients by minimizing the dispersion of the residuals, The Annals of Mathematical Statistics, 43, 1449-1458.
6 Jung KM (2011). Weighted least absolute deviation LASSO estimator, Communications for Statistical Applications and Methods, 18, 733-739.
7 Jung KM (2012). Weighted least absolute deviation regression estimator with the SCAD function, Journal of the Korean Data Analysis Society, 14, 2305-2312.
8 Jung KM (2013). Weighted support vector machines with the SCAD penalty, Communications for Statistical Applications and Methods, 20, 481-490.
9 Jung SY and Park C (2015). Variable selection with nonconcave penalty function on reduced-rank regression, Communications for Statistical Applications and Methods, 22, 41-54.
10 Kim HJ, Ollila E, and Koivunen V (2015). New robust LASSO method based on ranks. In Proceedings of the 23rd European Signal Processing Conference, Nice, France, 704-708.
11 Lee S (2015). An additive sparse penalty for variable selection in high-dimensional linear regression model, Communications for Statistical Applications and Methods, 22, 147-157.
12 Leng C, Lin Y, and Wahba G (2006). A note on the LASSO and related procedures in model selection, Statistica Sinica, 16, 1273-1284.
13 Rousseeuw PJ and Leroy AM (1987). Robust Regression and Outlier Detection, John Wiley, New York.
14 Tibshirani R (1996). Regression shrinkage and selection via the LASSO, Journal of the Royal Statistical Society Series B (Methodological), 58, 267-288.
15 Wang H, Li G, and Jiang G (2007). Robust regression shrinkage and consistent variable selection through the LAD-Lasso, Journal of Business & Economic Statistics, 25, 347-355.
16 Zou H and Li R (2008). One-step sparse estimates in nonconcave penalized likelihood models, Annals of Statistics, 36, 1509-1533.