http://dx.doi.org/10.5351/KJAS.2016.29.1.027

Adaptive lasso in sparse vector autoregressive models  

Lee, Sl Gi (Department of Statistics, Sungkyunkwan University)
Baek, Changryong (Department of Statistics, Sungkyunkwan University)
Publication Information
The Korean Journal of Applied Statistics / v.29, no.1, 2016, pp. 27-39
Abstract
This paper considers variable selection in the sparse vector autoregressive (sVAR) model, where sparsity comes from setting small coefficients to exact zeros. From the estimation perspective, Davis et al. (2015) showed that lasso-type regularization is successful because it provides simultaneous variable selection and parameter estimation even for time series data. However, their simulation study reports that the regular lasso overestimates the number of non-zero coefficients, so its finite-sample performance needs improvement. In this article, we show that the adaptive lasso significantly improves performance, recovering the sparsity pattern much better than the regular lasso. The choice of tuning parameters for the adaptive lasso is also discussed based on the simulation study.
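The comparison in the abstract can be illustrated with a small simulation. The sketch below is not the authors' implementation; it fits each row of a VAR(1) transition matrix by a lasso regression of X_t on X_{t-1} (scikit-learn's Lasso), then refits with adaptive weights 1/|beta_init|^gamma (gamma = 1, following Zou, 2006) obtained from the initial lasso. The dimension, sample size, sparsity pattern, and fixed penalty level alpha are all illustrative assumptions, not values from the paper.

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)

# Simulate a sparse VAR(1): X_t = A X_{t-1} + e_t, with nonzeros only on the
# diagonal of A (an illustrative sparsity pattern, not the paper's design).
d, T = 10, 500
A = np.zeros((d, d))
np.fill_diagonal(A, 0.4)
X = np.zeros((T, d))
for t in range(1, T):
    X[t] = A @ X[t - 1] + rng.normal(scale=0.5, size=d)

Y, Z = X[1:], X[:-1]  # regress X_t (responses) on X_{t-1} (covariates)

def fit_rowwise(weights=None, alpha=0.02):
    """Estimate A row by row with the lasso; adaptive weights are applied
    via the standard covariate-rescaling trick: penalizing w_j * |b_j| is
    equivalent to running a plain lasso on Z_j / w_j and rescaling back."""
    est = np.zeros((d, d))
    for i in range(d):
        w = np.ones(d) if weights is None else weights[i]
        m = Lasso(alpha=alpha, fit_intercept=False).fit(Z / w, Y[:, i])
        est[i] = m.coef_ / w  # map coefficients back to the original scale
    return est

A_lasso = fit_rowwise()
# Adaptive weights 1/|beta_init| from the initial lasso fit; the floor 1e-4
# keeps weights finite where the initial estimate is exactly zero.
w = 1.0 / np.maximum(np.abs(A_lasso), 1e-4)
A_alasso = fit_rowwise(weights=w)

print("true nonzeros:          ", int((A != 0).sum()))
print("lasso nonzeros:         ", int((A_lasso != 0).sum()))
print("adaptive lasso nonzeros:", int((A_alasso != 0).sum()))
```

With a small fixed penalty, the plain lasso typically retains spurious off-diagonal coefficients, while the large adaptive weights attached to near-zero initial estimates push those entries back to exact zero — the overestimation-of-nonzeros behavior the abstract describes.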
Keywords
sparse vector autoregressive model; adaptive lasso; high dimensional time series;
Reference
1 Arnold, A., Liu, Y., and Abe, N. (2008). Temporal causal modeling with graphical Granger methods, In Proceedings of the 13th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining.
2 Cule, E. and De Iorio, M. (2013). Ridge regression in prediction problems: automatic choice of the ridge parameter, Genetic Epidemiology, 37, 704-714.
3 Dahlhaus, R., Eichler, M., and Sandkühler, J. (1997). Identification of synaptic connections in neural ensembles by graphical models, Journal of Neuroscience Methods, 77, 93-107.
4 Davis, R. A., Zang, P., and Zheng, T. (2015). Sparse vector autoregressive modeling, arXiv:1207.0520.
5 Granger, C. W. J. (1969). Investigating causal relations by econometric models and cross-spectral methods, Econometrica, 37, 424-438.
6 Hastie, T., Tibshirani, R., and Wainwright, M. (2015). Statistical Learning with Sparsity: The Lasso and Generalizations, CRC Press.
7 Hsu, N.-J., Hung, H.-L., and Chang, Y.-M. (2008). Subset selection for vector autoregressive processes using lasso, Computational Statistics & Data Analysis, 52, 3645-3657.
8 Huang, J., Ma, S., and Zhang, C.-H. (2008). Adaptive lasso for sparse high-dimensional regression models, Statistica Sinica, 18, 1608-1618.
9 Lozano, A. C., Abe, N., Liu, Y., and Rosset, S. (2009). Grouped graphical Granger modeling for gene expression regulatory networks discovery, Bioinformatics, 25, 110-118.
10 Lütkepohl, H. (2005). New Introduction to Multiple Time Series Analysis, Springer-Verlag, Berlin.
11 Sims, C. A. (1980). Macroeconomics and reality, Econometrica, 48, 1-48.
12 Song, S. and Bickel, P. J. (2011). Large vector auto regressions, arXiv:1106.3915.
13 Tibshirani, R. (1996). Regression shrinkage and selection via the lasso, Journal of the Royal Statistical Society, Series B, 58, 267-288.
14 Zhang, J., Jeng, X. J., and Liu, H. (2008). Some two-step procedures for variable selection in high-dimensional linear regression, arXiv:0810.1644.
15 Zou, H. (2006). The adaptive lasso and its oracle properties, Journal of the American Statistical Association, 101, 1418-1429.