More on directional regression

  • Kim, Kyongwon (Department of Statistics, Ewha Womans University) ;
  • Yoo, Jae Keun (Department of Statistics, Ewha Womans University)
  • Received : 2021.05.05
  • Accepted : 2021.05.24
  • Published : 2021.09.30

Abstract

Directional regression (DR; Li and Wang, 2007) is a well-known exhaustive sufficient dimension reduction method that performs well in complex regression models with both linear and nonlinear trends. However, DR has not been widely extended to date, so we extend it to accommodate multivariate regression and large p-small n regression. We propose three versions of DR for multivariate regression and discuss how DR can be applied in the large p-small n case. Numerical studies confirm that DR is robust to the number of clusters and to the choice between hierarchical-clustering and pooled DR.
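For readers unfamiliar with the method, the sample version of DR can be sketched as follows. This is a minimal illustration of the slicing-based estimator of Li and Wang's (2007) candidate matrix, not the authors' own implementation; the function name, slice count, and simulation setup are ours.

```python
import numpy as np

def dr_directions(X, y, d=1, n_slices=5):
    """Estimate d sufficient dimension reduction directions by
    directional regression (DR; Li and Wang, 2007). Minimal sketch."""
    n, p = X.shape
    # Standardize predictors: Z = Sigma^{-1/2} (X - mean)
    Xc = X - X.mean(axis=0)
    Sigma = np.cov(X, rowvar=False)
    evals, evecs = np.linalg.eigh(Sigma)
    Sigma_inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T
    Z = Xc @ Sigma_inv_sqrt

    # Slice the response and accumulate within-slice first and
    # second moments of Z
    order = np.argsort(y)
    A = np.zeros((p, p))   # sum of p_h (E(ZZ'|slice) - I)^2
    B = np.zeros((p, p))   # sum of p_h E(Z|slice) E(Z'|slice)
    c = 0.0                # sum of p_h E(Z'|slice) E(Z|slice)
    for idx in np.array_split(order, n_slices):
        ph = len(idx) / n
        Zh = Z[idx]
        uh = Zh.mean(axis=0)            # E(Z | slice)
        Vh = Zh.T @ Zh / len(idx)       # E(ZZ' | slice)
        A += ph * (Vh - np.eye(p)) @ (Vh - np.eye(p))
        B += ph * np.outer(uh, uh)
        c += ph * uh @ uh

    # DR candidate matrix; using (V - I)^2 absorbs the -2I term of
    # the population formula since E{E(ZZ'|Y)} = I
    M = 2 * A + 2 * B @ B + 2 * c * B

    # Leading eigenvectors, mapped back to the original X-scale
    w, v = np.linalg.eigh(M)
    return Sigma_inv_sqrt @ v[:, ::-1][:, :d]
```

Because the candidate matrix uses second conditional moments as well as first ones, DR recovers directions in purely symmetric (nonlinear) trends such as y = (b'x)^2, where first-moment methods like sliced inverse regression fail.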

Acknowledgement

For Kyongwon Kim, this work was supported by the Ewha Womans University Research Grant of 2021 and the National Research Foundation of Korea (NRF) grant funded by the Korea government (MSIT) (No. 2021R1F1A1046976). For Jae Keun Yoo, this work was supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Korean Ministry of Education (NRF-2021R1F1A1059844).

References

  1. Cook RD and Weisberg S (1991). Comment: Sliced inverse regression for dimension reduction, Journal of the American Statistical Association, 86, 328-332.
  2. Cook RD, Li B, and Chiaromonte F (2007). Dimension reduction in regression without matrix inversion, Biometrika, 94, 569-584. https://doi.org/10.1093/biomet/asm038
  3. Hooper J (1959). Simultaneous equations and canonical correlation theory, Econometrica, 27, 245-256. https://doi.org/10.2307/1909445
  4. Li B and Wang S (2007). On directional regression for dimension reduction, Journal of the American Statistical Association, 102, 997-1008. https://doi.org/10.1198/016214507000000536
  5. Li B, Zha H, and Chiaromonte F (2005). Contour regression: a general approach to dimension reduction, The Annals of Statistics, 33, 1580-1616. https://doi.org/10.1214/009053605000000192
  6. Li KC (1991). Sliced inverse regression for dimension reduction, Journal of the American Statistical Association, 86, 316-327. https://doi.org/10.1080/01621459.1991.10475035
  7. Setodji CM and Cook RD (2004). K-means inverse regression, Technometrics, 46, 421-429. https://doi.org/10.1198/004017004000000437
  8. Um HY, Won S, An H, and Yoo JK (2018). Case study: application of fused sliced average variance estimation to near-infrared spectroscopy of biscuit dough data, The Korean Journal of Applied Statistics, 31, 835-842. https://doi.org/10.5351/KJAS.2018.31.6.835
  9. Yoo JK (2016a). Tutorial: Dimension reduction in regression with a notion of sufficiency, Communications for Statistical Applications and Methods, 23, 93-103. https://doi.org/10.5351/CSAM.2016.23.2.093
  10. Yoo JK (2016b). Tutorial: Methodologies for sufficient dimension reduction in regression, Communications for Statistical Applications and Methods, 23, 95-117.
  11. Yoo C, Yoo Y, Um HY, and Yoo JK (2020). On hierarchical clustering in sufficient dimension reduction, Communications for Statistical Applications and Methods, 27, 431-443. https://doi.org/10.29220/CSAM.2020.27.4.431
  12. Yoo JK, Lee K, and Woo S (2010). On the extension of sliced average variance estimation to multi-variate regression, Statistical Methods and Applications, 19, 529-540. https://doi.org/10.1007/s10260-010-0145-9