Fused inverse regression with multi-dimensional responses

  • Cho, Youyoung (Department of Statistics, Ewha Womans University)
  • Han, Hyoseon (Department of Statistics, Ewha Womans University)
  • Yoo, Jae Keun (Department of Statistics, Ewha Womans University)
  • Received : 2020.12.28
  • Accepted : 2021.02.05
  • Published : 2021.05.31

Abstract

Regression with a multi-dimensional response is quite common in the so-called big data era. In such a regression, dimension reduction of the predictors is essential in analysis to relieve the curse of dimensionality arising from the high-dimensional response. Sufficient dimension reduction provides effective tools for this reduction, but few sufficient dimension reduction methodologies exist for multivariate regression. To fill this gap, we newly propose two fused slice-based inverse regression methods. The proposed approaches are robust to the number of clusters or slices and improve estimation over existing methods by fusing many kernel matrices. Numerical studies are presented and compared with existing methods, and a real data analysis confirms the practical usefulness of the proposed methods.
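To make the fusing idea concrete, the sketch below illustrates one simple fused inverse regression estimator for a multi-dimensional response: the response is grouped by k-means clustering (in the spirit of k-means inverse regression), a slice-based kernel matrix is formed for each candidate number of clusters, and the kernels are fused by summation before the spectral decomposition. The function names, the cluster grid, and the summation fusing rule are illustrative assumptions, not the authors' exact algorithm.

```python
import numpy as np
from sklearn.cluster import KMeans

def inverse_regression_kernel(X, labels):
    """Weighted between-group covariance of the standardized predictors."""
    n, p = X.shape
    Xc = X - X.mean(axis=0)
    L = np.linalg.cholesky(np.cov(X, rowvar=False))
    A = np.linalg.inv(L).T                  # whitening matrix: Cov(Xc @ A) = I
    Z = Xc @ A
    M = np.zeros((p, p))
    for g in np.unique(labels):
        idx = labels == g
        m = Z[idx].mean(axis=0)
        M += (idx.sum() / n) * np.outer(m, m)
    return M, A

def fused_kmeans_inverse_regression(X, Y, cluster_grid=(2, 3, 4, 5), d=1, seed=0):
    """Fuse kernel matrices over several numbers of response clusters and
    return d estimated directions on the original predictor scale."""
    p = X.shape[1]
    M_fused = np.zeros((p, p))
    for h in cluster_grid:
        # group the multi-dimensional response into h clusters ("slices")
        labels = KMeans(n_clusters=h, n_init=10, random_state=seed).fit_predict(Y)
        M_h, A = inverse_regression_kernel(X, labels)
        M_fused += M_h                      # simple fusing rule: sum the kernels
    _, vecs = np.linalg.eigh(M_fused)       # eigenvalues in ascending order
    return A @ vecs[:, ::-1][:, :d]         # leading d directions, back-transformed
```

Because the fused kernel aggregates information across the whole grid, the estimated directions are much less sensitive to any single choice of the number of clusters, which is the robustness property emphasized in the abstract.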

Keywords

Acknowledgement

For Jae Keun Yoo, this work was supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Korean Ministry of Education (NRF-2019R1F1A1050715).

References

  1. Cook RD and Zhang X (2014). Fused estimators of the central subspace in sufficient dimension reduction, Journal of the American Statistical Association, 109, 815-827. https://doi.org/10.1080/01621459.2013.866563
  2. Lee K, Choi Y, Um H, and Yoo JK (2019). On fused dimension reduction in multivariate regression, Chemometrics and Intelligent Laboratory Systems, 193, 103828. https://doi.org/10.1016/j.chemolab.2019.103828
  3. Li KC (1991). Sliced inverse regression for dimension reduction, Journal of the American Statistical Association, 86, 316-327. https://doi.org/10.1080/01621459.1991.10475035
  4. Setodji CM and Cook RD (2004). K-means inverse regression, Technometrics, 46, 421-429. https://doi.org/10.1198/004017004000000437
  5. Yin X and Bura E (2006). Moment-based dimension reduction for multivariate response regression, Journal of Statistical Planning and Inference, 136, 3675-3688. https://doi.org/10.1016/j.jspi.2005.01.011
  6. Yoo C, Yoo Y, Um HY, and Yoo JK (2020). On hierarchical clustering in sufficient dimension reduction, Communications for Statistical Applications and Methods, 27, 431-443. https://doi.org/10.29220/CSAM.2020.27.4.431
  7. Yoo JK (2008). A novel moment-based dimension reduction approach in multivariate regression, Computational Statistics and Data Analysis, 52, 3843-3851. https://doi.org/10.1016/j.csda.2008.01.004
  8. Yoo JK (2009). Iterative optimal sufficient dimension reduction for the conditional mean in multivariate regression, Journal of Data Science, 7, 267-276.
  9. Yoo JK (2016a). Tutorial: Dimension reduction in regression with a notion of sufficiency, Communications for Statistical Applications and Methods, 23, 93-103. https://doi.org/10.5351/CSAM.2016.23.2.093
  10. Yoo JK (2016b). Tutorial: Methodologies for sufficient dimension reduction in regression, Communications for Statistical Applications and Methods, 23, 95-117.
  11. Yoo JK (2018). Basis-adaptive selection algorithm in dr-package, The R Journal, 10, 124-132. https://doi.org/10.32614/rj-2018-045
  12. Yoo JK and Cook RD (2007). Optimal sufficient dimension reduction for the conditional mean in multivariate regression, Biometrika, 94, 231-242. https://doi.org/10.1093/biomet/asm003
  13. Yoo JK, Lee K, and Woo S (2010). On the extension of sliced average variance estimation to multivariate regression, Statistical Methods and Applications, 19, 529-540. https://doi.org/10.1007/s10260-010-0145-9