http://dx.doi.org/10.5351/CSAM.2015.22.3.305

Generalized Partially Double-Index Model: Bootstrapping and Distinguishing Values  

Yoo, Jae Keun (Department of Statistics, Ewha Womans University)
Publication Information
Communications for Statistical Applications and Methods, Vol. 22, No. 3, 2015, pp. 305-312
Abstract
We extend the generalized partially linear single-index model and newly define a generalized partially double-index model (GPDIM). The philosophy of sufficient dimension reduction is adopted in the GPDIM to estimate the unknown coefficient vectors in the model. Various combinations of popular sufficient dimension reduction methods are then constructed, and the best combination among the candidates is determined through a bootstrapping procedure that measures distances between subspaces. Distinguishing values are newly defined to match the estimates to the corresponding population coefficient vectors. One strength of the proposed model is that it can be used to investigate the appropriateness of the GPDIM over a single-index model. Various numerical studies confirm the proposed approach, and a real data application is presented for illustration purposes.
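
Note on the bootstrap criterion (an illustrative sketch, not the paper's code). This page does not spell out the estimators or the distance measure, but the general idea of using the bootstrap to choose among sufficient dimension reduction candidates, as in Ye and Weiss (2003), can be sketched as follows: each candidate basis is re-estimated on bootstrap resamples, the distance between every bootstrap subspace and the full-data subspace is computed, and the candidate with the smallest average distance is judged most stable. The Python sketch below assumes sliced inverse regression (Li, 1991) as the candidate method, d = 2 indices as in a double-index model, and one minus the trace correlation of Hooper (1959) as the subspace distance; all function names and defaults are illustrative assumptions rather than the paper's implementation.

import numpy as np

def sir_directions(X, y, d=2, n_slices=5):
    """Sliced inverse regression (Li, 1991): a d-dimensional basis for the
    central subspace, expressed on the standardized-predictor scale."""
    n, p = X.shape
    mu = X.mean(axis=0)
    Sigma = np.cov(X, rowvar=False)
    L = np.linalg.cholesky(Sigma)              # Sigma = L L'
    Z = (X - mu) @ np.linalg.inv(L).T          # standardized predictors
    order = np.argsort(y)                      # slice the sorted response
    slices = np.array_split(order, n_slices)
    M = np.array([Z[idx].mean(axis=0) for idx in slices])  # slice means
    _, _, Vt = np.linalg.svd(M, full_matrices=False)
    return Vt[:d].T                            # p x d orthonormal basis

def subspace_distance(A, B):
    """One minus the trace correlation (Hooper, 1959) between span(A) and
    span(B): 0 when the subspaces coincide, 1 when they are orthogonal."""
    PA = A @ np.linalg.pinv(A)                 # projection onto span(A)
    PB = B @ np.linalg.pinv(B)
    d = A.shape[1]
    return 1.0 - np.sqrt(np.trace(PA @ PB) / d)

def bootstrap_instability(X, y, d=2, B=200, seed=0):
    """Average distance between the full-data estimate and B bootstrap
    re-estimates; a smaller value indicates a more stable candidate."""
    rng = np.random.default_rng(seed)
    full = sir_directions(X, y, d)
    n = len(y)
    dists = []
    for _ in range(B):
        idx = rng.integers(0, n, size=n)       # bootstrap resample
        boot = sir_directions(X[idx], y[idx], d)
        dists.append(subspace_distance(full, boot))
    return float(np.mean(dists))

Under this sketch, the same criterion would be evaluated for each candidate combination of dimension reduction methods, and the combination with the smallest average bootstrap distance would be selected, mirroring the abstract's description of choosing the best combination through bootstrapped subspace distances.
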
Keywords
bootstrapping; central subspace; distinguishing value; generalized partially linear single-index model; kernel matrix; regression; sufficient dimension reduction
Citations & Related Records
Times Cited by KSCI: 2
1 Carroll, R. J., Fan, J., Gijbels, I. and Wand, M. P. (1997). Generalized partially linear single-index models, Journal of the American Statistical Association, 92, 477-489.
2 Cook, R. D. (1998). Regression Graphics: Ideas for Studying Regressions through Graphics, John Wiley & Sons, New York.
3 Cook, R. D. and Weisberg, S. (1991). Comment: Sliced inverse regression for dimension reduction by Ker-Chau Li, Journal of the American Statistical Association, 86, 328-332.
4 Hooper, J. (1959). Simultaneous equations and canonical correlation theory, Econometrica, 27, 245-256.
5 Li, K. C. (1991). Sliced inverse regression for dimension reduction, Journal of the American Statistical Association, 86, 316-327.
6 Li, K. C. (1992). On principal Hessian directions for data visualization and dimension reduction: Another application of Stein's lemma, Journal of the American Statistical Association, 87, 1025-1039.
7 Shao, Y., Cook, R. D. and Weisberg, S. (2006). The linearity condition and adaptive estimation in single-index regressions, Available from: http://arxiv.org/pdf/1001.4802.pdf
8 Ye, Z. and Weiss, R. E. (2003). Using the bootstrap to select one of a new class of dimension reduction methods, Journal of the American Statistical Association, 98, 968-979.
9 Yin, X. and Cook, R. D. (2002). Dimension reduction for the conditional kth moment in regression, Journal of the Royal Statistical Society, Series B, 64, 159-175.
10 Yoo, J. K. (2008). A novel moment-based dimension reduction approach in multivariate regression, Computational Statistics and Data Analysis, 52, 3843-3851.
11 Yoo, J. K. (2009). Partial moment-based sufficient dimension reduction, Statistics and Probability Letters, 79, 450-456.
12 Yoo, J. K. (2010). Integrated partial sufficient dimension reduction with heavily unbalanced categorical predictors, The Korean Journal of Applied Statistics, 23, 977-985.
13 Yoo, J. K. (2011). Unified predictor hypothesis tests in sufficient dimension reduction: A bootstrap approach, Computational Statistics and Data Analysis, 40, 217-225.
14 Yoo, J. K. (2013). Advances in seeded dimension reduction: Bootstrap criteria and extensions, Computational Statistics and Data Analysis, 60, 70-79.