THE SYMMETRIZED LOG-DETERMINANT DIVERGENCE

  • SEJONG KIM (Department of Mathematics, Chungbuk National University) ;
  • VATSALKUMAR N. MER (Institute for Industrial and Applied Mathematics, Chungbuk National University)
  • Received : 2024.02.23
  • Accepted : 2024.07.07
  • Published : 2024.07.30

Abstract

We examine fundamental properties of the log-determinant α-divergence, including the convexity of the weighted geometric mean and the reversed sub-additivity under tensor products. We introduce a symmetrized divergence and establish its properties, including boundedness and monotonicity in the parameters. Finally, we discuss the barycenter minimizing the weighted sum of symmetrized divergences.
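For orientation, the log-determinant α-divergence of positive definite matrices (in the form studied by Chebbi and Moakher) can be sketched numerically. The sketch below is an illustration, not the paper's code: the function names are ours, and the symmetrization shown (averaging the divergence over both argument orders) is one natural choice, which need not coincide exactly with the symmetrized divergence defined in the paper.

```python
import numpy as np

def logdet_alpha_div(A, B, alpha):
    """Log-determinant alpha-divergence of positive definite A, B,
    for -1 < alpha < 1 (Chebbi-Moakher form):
    D_alpha(A,B) = 4/(1-alpha^2) * log[ det((1-alpha)/2 A + (1+alpha)/2 B)
                                        / (det A^{(1-alpha)/2} det B^{(1+alpha)/2}) ].
    """
    assert -1 < alpha < 1
    w = (1 - alpha) / 2  # weight on A; 1-w on B
    # slogdet is numerically safer than det for log-determinants
    _, ld_mix = np.linalg.slogdet(w * A + (1 - w) * B)
    _, ld_a = np.linalg.slogdet(A)
    _, ld_b = np.linalg.slogdet(B)
    return 4 / (1 - alpha**2) * (ld_mix - w * ld_a - (1 - w) * ld_b)

def symmetrized_div(A, B, alpha):
    """One natural symmetrization: average over both argument orders.
    Since swapping A and B matches replacing alpha by -alpha, this also
    equals (D_alpha + D_{-alpha})/2 evaluated at (A, B)."""
    return 0.5 * (logdet_alpha_div(A, B, alpha) + logdet_alpha_div(B, A, alpha))

if __name__ == "__main__":
    A = np.array([[2.0, 0.0], [0.0, 3.0]])
    B = np.eye(2)
    print(logdet_alpha_div(A, A, 0.3))   # zero: divergence of A from itself
    print(symmetrized_div(A, B, 0.3))    # positive, and symmetric in (A, B)
```

At α = 0 this reduces (up to the factor 4) to the symmetric Stein-type divergence log det((A+B)/2) − ½ log det(AB).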

Acknowledgement

We thank the anonymous reviewers for their valuable comments.
