Acknowledgement
Supported by the National Research Foundation of Korea (NRF).
References
- Alfons A, Croux C, and Gelper S (2013). Sparse least trimmed squares regression for analyzing high-dimensional large data sets, The Annals of Applied Statistics, 7, 226-248. https://doi.org/10.1214/12-AOAS575
- Chen X, Wang J, and McKeown MJ (2010). Asymptotic analysis of robust LASSOs in the presence of noise with large variance, IEEE Transactions on Information Theory, 56, 5131-5149. https://doi.org/10.1109/TIT.2010.2059770
- Fan J and Li R (2001). Variable selection via nonconcave penalized likelihood and its oracle properties, Journal of the American Statistical Association, 96, 1348-1360. https://doi.org/10.1198/016214501753382273
- Hoerl AE and Kennard RW (1970). Ridge regression: biased estimation for nonorthogonal problems, Technometrics, 12, 55-67. https://doi.org/10.1080/00401706.1970.10488634
- Jaeckel LA (1972). Estimating regression coefficients by minimizing the dispersion of the residuals, The Annals of Mathematical Statistics, 43, 1449-1458. https://doi.org/10.1214/aoms/1177692377
- Jung KM (2011). Weighted least absolute deviation LASSO estimator, Communications for Statistical Applications and Methods, 18, 733-739. https://doi.org/10.5351/CKSS.2011.18.6.733
- Jung KM (2012). Weighted least absolute deviation regression estimator with the SCAD function, Journal of the Korean Data Analysis Society, 14, 2305-2312.
- Jung KM (2013). Weighted support vector machines with the SCAD penalty, Communications for Statistical Applications and Methods, 20, 481-490. https://doi.org/10.5351/CSAM.2013.20.6.481
- Jung SY and Park C (2015). Variable selection with nonconcave penalty function on reduced-rank regression, Communications for Statistical Applications and Methods, 22, 41-54. https://doi.org/10.5351/CSAM.2015.22.1.041
- Kim HJ, Ollila E, and Koivunen V (2015). New robust LASSO method based on ranks. In Proceedings of the 23rd European Signal Processing Conference, Nice, France, 704-708.
- Lee S (2015). An additive sparse penalty for variable selection in high-dimensional linear regression model, Communications for Statistical Applications and Methods, 22, 147-157. https://doi.org/10.5351/CSAM.2015.22.2.147
- Leng C, Lin Y, and Wahba G (2006). A note on the LASSO and related procedures in model selection, Statistica Sinica, 16, 1273-1284.
- Rousseeuw PJ and Leroy AM (1987). Robust Regression and Outlier Detection, John Wiley, New York.
- Tibshirani R (1996). Regression shrinkage and selection via the LASSO, Journal of the Royal Statistical Society Series B (Methodological), 58, 267-288. https://doi.org/10.1111/j.2517-6161.1996.tb02080.x
- Wang H, Li G, and Jiang G (2007). Robust regression shrinkage and consistent variable selection through the LAD-Lasso, Journal of Business & Economic Statistics, 25, 347-355. https://doi.org/10.1198/073500106000000251
- Zou H and Li R (2008). One-step sparse estimates in nonconcave penalized likelihood models, The Annals of Statistics, 36, 1509-1533. https://doi.org/10.1214/009053607000000802