Comprehensive studies of Grassmann manifold optimization and sequential candidate set algorithm in a principal fitted component model

  • Received : 2022.08.25
  • Accepted : 2022.09.06
  • Published : 2022.11.30

Abstract

In this paper, we compare parameter estimation by Grassmann manifold optimization and by the sequential candidate set algorithm in the structured principal fitted component (PFC) model. The structured PFC model generalizes the covariance matrix of the random error to relieve the limitations imposed by an overly simple form of that matrix. However, unlike other PFC models, the structured PFC model has no closed-form solution for the parameter estimates used in dimension reduction, so numerical computation is required. This computation can be carried out through Grassmann manifold optimization or the sequential candidate set algorithm. We conducted numerical studies comparing the two methods, reporting the results of sequential dimension tests and trace correlation values, which measure performance in determining the dimension and in estimating the basis, respectively. We conclude that Grassmann manifold optimization outperforms the sequential candidate set algorithm in determining the dimension, while the sequential candidate set algorithm is better at estimating the basis. Applying both methods to real data yielded the same conclusion.
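The abstract evaluates basis estimation with trace correlation values. A minimal NumPy sketch of that criterion, assuming the usual definition r = sqrt(tr(P P̂)/d) with P and P̂ the orthogonal projections onto the true and estimated subspaces (the paper's exact formulation is not shown here):

```python
import numpy as np

def trace_correlation(B, B_hat):
    """Trace correlation between span(B) and span(B_hat).

    B and B_hat are p x d basis matrices. The value lies in [0, 1];
    1 means the estimated subspace coincides with the true one.
    """
    # Orthogonal projection matrices onto the two column spaces.
    P = B @ np.linalg.inv(B.T @ B) @ B.T
    P_hat = B_hat @ np.linalg.inv(B_hat.T @ B_hat) @ B_hat.T
    d = B.shape[1]
    return np.sqrt(np.trace(P @ P_hat) / d)
```

Because the criterion depends only on the projection matrices, any basis spanning the same subspace (e.g., B and B @ R for invertible R) gives trace correlation 1, which is why it is a natural measure for comparing subspace estimates rather than individual basis vectors.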

Acknowledgement

For Chaeyoung Lee and Jae Keun Yoo, this work was supported by the MSIT (Ministry of Science and ICT), Korea, under the High-Potential Individuals Global Training Program (RS-2022-00154879) supervised by the IITP (Institute for Information & Communications Technology Planning & Evaluation). For Jae Keun Yoo, this work was supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF), funded by the Korean Ministry of Education (NRF-2021R1F1A1059844).
