Two variations of cross-distance selection algorithm in hybrid sufficient dimension reduction

  • Jae Keun Yoo (Department of Statistics, Ewha Womans University)
  • Received : 2022.09.22
  • Accepted : 2022.10.30
  • Published : 2023.03.31

Abstract

Hybrid sufficient dimension reduction (SDR) methods, which combine the kernel matrices of two different SDR methods through a weighted mean (Ye and Weiss, 2003), require heavy computation and long run times because of bootstrapping. To avoid this, Park et al. (2022) recently developed the so-called cross-distance selection (CDS) algorithm. In this paper, two variations of the original CDS algorithm are proposed, depending on how fully and equally covk-SAVE is treated in the selection procedure. In the first variation, called the larger CDS algorithm, covk-SAVE is utilized equally and fairly alongside the other two candidates, SIR-SAVE and covk-DR; however, a random selection then becomes necessary for the final choice. In the second variation, called the smaller CDS algorithm, only SIR-SAVE and covk-DR are utilized, with covk-SAVE ruled out completely. Numerical studies confirm that the original CDS algorithm is better than, or competes quite well with, the two proposed variations. A real data example is presented to compare and interpret the decisions made by the three CDS algorithms in practice.
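The cross-distance idea behind the three variants can be illustrated with a minimal sketch. This is not the actual procedure of Park et al. (2022): the subspace-distance metric, the total-distance selection rule, and all function names below are illustrative assumptions. The sketch only shows how one might compare candidate basis estimates (SIR-SAVE, covk-DR, covk-SAVE) by pairwise subspace distances, keep the candidate closest to the others, and break ties at random, as the larger variant requires; the smaller variant simply drops covk-SAVE from the candidate set.

```python
import numpy as np

def subspace_distance(A, B):
    # Distance between the column spaces of A and B via orthogonal
    # projections: 0 when the spans coincide, 1 when they are orthogonal.
    # This metric is an illustrative assumption, not the paper's choice.
    PA = A @ np.linalg.pinv(A)
    PB = B @ np.linalg.pinv(B)
    return 1.0 - np.trace(PA @ PB) / A.shape[1]

def cds_select(candidates, rng=None):
    # Pick the candidate basis whose total cross-distance to the other
    # candidates is smallest; ties are broken by a random selection.
    rng = rng or np.random.default_rng()
    names = list(candidates)
    totals = {n: sum(subspace_distance(candidates[n], candidates[m])
                     for m in names if m != n)
              for n in names}
    best = min(totals.values())
    winners = [n for n in names if np.isclose(totals[n], best)]
    return rng.choice(winners)

# Hypothetical basis estimates in R^5 with structural dimension d = 2:
E = np.eye(5)
cands = {"SIR-SAVE": E[:, :2], "covk-DR": E[:, :2], "covk-SAVE": E[:, 2:4]}
larger = cds_select(cands)                 # all three candidates compete
smaller = cds_select({k: v for k, v in cands.items() if k != "covk-SAVE"})
```

In this toy setup SIR-SAVE and covk-DR span the same subspace, so either of them wins under both variants; the random tie-break is what the larger variant needs whenever several candidates are equally close.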

Keywords

Acknowledgments

For Jae Keun Yoo, this work was supported by Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Korean Ministry of Education (NRF-2021R1F1A1059844).

References

  1. Cook RD (1998a). Regression Graphics, Wiley, New York.
  2. Cook RD (1998b). Principal Hessian directions revisited, Journal of the American Statistical Association, 93, 84-94. https://doi.org/10.1080/01621459.1998.10474090
  3. Cook RD and Weisberg S (1991). Discussion of "sliced inverse regression for dimension reduction" by Li KC, Journal of the American Statistical Association, 86, 328-332. https://doi.org/10.2307/2290564
  4. Hooper JW (1959). Simultaneous equations and canonical correlation theory, Econometrica, 27, 245-256. https://doi.org/10.2307/1909445
  5. Li B and Wang S (2007). On directional regression for dimension reduction, Journal of the American Statistical Association, 102, 997-1008. https://doi.org/10.1198/016214507000000536
  6. Li KC (1991). Sliced inverse regression for dimension reduction, Journal of the American Statistical Association, 86, 316-327. https://doi.org/10.1080/01621459.1991.10475035
  7. Li KC (1992). On principal Hessian directions for data visualization and dimension reduction: Another application of Stein's lemma, Journal of the American Statistical Association, 87, 1025-1039. https://doi.org/10.1080/01621459.1992.10476258
  8. Park Y, Kim K, and Yoo JK (2022). On cross-distance selection algorithm for hybrid sufficient dimension reduction, Computational Statistics and Data Analysis, 176, 1075627.
  9. Ye Z and Weiss RE (2003). Using the bootstrap to select one of a new class of dimension reduction methods, Journal of the American Statistical Association, 98, 968-979. https://doi.org/10.1198/016214503000000927
  10. Yin X and Cook RD (2002). Dimension reduction for the conditional kth moment in regression, Journal of the Royal Statistical Society: Series B, 64, 159-175. https://doi.org/10.1111/1467-9868.00330
  11. Yoo JK (2009). Partial moment-based sufficient dimension reduction, Statistics and Probability Letters, 79, 450-456. https://doi.org/10.1016/j.spl.2008.09.024
  12. Yoo JK (2018). Basis-adaptive selection algorithm in dr-package, The R Journal, 10, 124-132. https://doi.org/10.32614/RJ-2018-045