A selective review of nonlinear sufficient dimension reduction

  • Sehun Jang (Department of Statistics, Korea University)
  • Jun Song (Department of Statistics, Korea University)
  • Received: 2024.01.21
  • Accepted: 2024.01.30
  • Published: 2024.03.31

Abstract

In this paper, we explore nonlinear sufficient dimension reduction (SDR) methods, with a primary focus on establishing a foundational framework that integrates various nonlinear SDR approaches. We illustrate generalized sliced inverse regression (GSIR) and generalized sliced average variance estimation (GSAVE), both of which fit within this framework. Further, we examine nonlinear extensions of inverse moments obtained through the kernel trick, specifically kernel sliced inverse regression (KSIR) and kernel canonical correlation analysis (KCCA), and explore how they relate to the established framework. We also briefly explain nonlinear SDR for functional data. In addition, we present practical aspects such as algorithmic implementations. The paper concludes with remarks on the dimensionality problem of the target function class.
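
To make the framework concrete: following Lee et al. (2013), nonlinear SDR seeks functions f_1, ..., f_d of the predictor X such that Y is independent of X given (f_1(X), ..., f_d(X)), and GSIR estimates such functions through regularized covariance operators between two reproducing kernel Hilbert spaces. The sketch below is a minimal illustration under these assumptions, not the authors' implementation: it uses Gaussian kernels, a single ridge-type regularization parameter eps, and hypothetical names (rbf_gram, kernel_gsir); centering and regularization conventions vary across the cited references.

```python
import numpy as np

def rbf_gram(Z, gamma):
    """Gaussian kernel Gram matrix: K[i, j] = exp(-gamma * ||z_i - z_j||^2)."""
    sq = np.sum(Z * Z, axis=1)
    return np.exp(-gamma * (sq[:, None] + sq[None, :] - 2.0 * Z @ Z.T))

def kernel_gsir(X, Y, d, gamma_x=1.0, gamma_y=1.0, eps=1e-3):
    """GSIR-style estimate of the first d nonlinear sufficient predictors,
    returned as their values at the n training points (an n-by-d matrix)."""
    n = X.shape[0]
    Q = np.eye(n) - np.full((n, n), 1.0 / n)    # centering projection
    Gx = Q @ rbf_gram(X, gamma_x) @ Q           # centered Gram matrix for X
    Gy = Q @ rbf_gram(Y, gamma_y) @ Q           # centered Gram matrix for Y
    # Ridge-regularized operators acting as smoothed projections onto the
    # spans of the centered kernel features of X and Y.
    Ax = Gx @ np.linalg.inv(Gx + eps * np.eye(n))
    Ay = Gy @ np.linalg.inv(Gy + eps * np.eye(n))
    # Candidate matrix: sample analogue of Var(E[f(X) | Y]) for functions f
    # in the span of the centered kernel features of X.
    M = Ax @ Ay @ Ay @ Ax
    M = (M + M.T) / 2.0                         # symmetrize for stability
    vals, vecs = np.linalg.eigh(M)              # eigenvalues in ascending order
    return vecs[:, np.argsort(vals)[::-1][:d]]  # top-d eigenvectors

# Toy usage: Y depends on X only through the single index sin(x1).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
Y = (np.sin(X[:, 0]) + 0.1 * rng.normal(size=200)).reshape(-1, 1)
preds = kernel_gsir(X, Y, d=1)   # estimated sufficient predictor at the sample
```

The returned eigenvectors play the role of the estimated sufficient predictors evaluated at the sample points; evaluating them at new observations requires expanding them in the kernel features of X, a step omitted here for brevity.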

Acknowledgement

This work was supported by the National Research Foundation of Korea (NRF) grant funded by the Korea government (MSIT) (No. 2022R1C1C1003647, No. RS-2023-00219212, and No. 2022M3J6A1063595) and a Korea University Grant (K2312771).

References

  1. Aronszajn N (1950). Theory of reproducing kernels, Transactions of the American Mathematical Society, 68, 337-404. https://doi.org/10.1090/S0002-9947-1950-0051437-7
  2. Baker CR (1973). Joint measures and cross-covariance operators, Transactions of the American Mathematical Society, 186, 273-289. https://doi.org/10.2307/1996566
  3. Berlinet A and Thomas-Agnan C (2011). Reproducing Kernel Hilbert Spaces in Probability and Statistics, Kluwer Academic Publishers, Boston, MA.
  4. Cook RD (1998). Regression Graphics, Wiley, New York.
  5. Cook RD and Weisberg S (1991). Sliced inverse regression for dimension reduction: Comment, Journal of the American Statistical Association, 86, 328-332. https://doi.org/10.1080/01621459.1991.10475036
  6. Fukumizu K, Bach FR, and Gretton A (2007). Statistical consistency of kernel canonical correlation analysis, Journal of Machine Learning Research, 8, 361-383.
  7. Lee K-Y, Li B, and Chiaromonte F (2013). A general theory for nonlinear sufficient dimension reduction: Formulation and estimation, The Annals of Statistics, 41, 221-249. https://doi.org/10.1214/12-AOS1071
  8. Li B (2018). Sufficient Dimension Reduction: Methods and Applications with R, CRC Press, Boca Raton, FL.
  9. Li B and Song J (2017). Nonlinear sufficient dimension reduction for functional data, The Annals of Statistics, 45, 1059-1095. https://doi.org/10.1214/16-AOS1475
  10. Li K-C (1991). Sliced inverse regression for dimension reduction, Journal of the American Statistical Association, 86, 316-327. https://doi.org/10.1080/01621459.1991.10475035
  11. Schölkopf B and Smola AJ (2002). Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond, MIT Press, Cambridge, MA.
  12. Song J, Kim K, and Yoo JK (2023). On a nonlinear extension of the principal fitted component model, Computational Statistics & Data Analysis, 182, 107707.
  13. Wu H-M (2008). Kernel sliced inverse regression with applications to classification, Journal of Computational and Graphical Statistics, 17, 590-610. https://doi.org/10.1198/106186008X345161
  14. Yeh Y-R, Huang S-Y, and Lee Y-J (2008). Nonlinear dimension reduction with kernel sliced inverse regression, IEEE Transactions on Knowledge and Data Engineering, 21, 1590-1603. https://doi.org/10.1109/TKDE.2008.232