References
- A. Y. Ng and M. I. Jordan. On discriminative vs. generative classifiers: A comparison of logistic regression and naive Bayes. In Advances in Neural Information Processing Systems 14, pages 841-848, 2001.
- J. V. Davis, B. Kulis, P. Jain, S. Sra, and I. S. Dhillon. Information-theoretic metric learning. In Proceedings of the 24th International Conference on Machine Learning, pages 209-216, 2007.
- A. Globerson and S. Roweis. Metric learning by collapsing classes. In Advances in Neural Information Processing Systems 18, pages 451-458, 2006.
- J. Goldberger, S. Roweis, G. Hinton, and R. Salakhutdinov. Neighbourhood components analysis. In Advances in Neural Information Processing Systems 17, pages 513-520, 2005.
- K. Weinberger, J. Blitzer, and L. Saul. Distance metric learning for large margin nearest neighbor classification. In Advances in Neural Information Processing Systems 18, pages 1473-1480, 2006.
- T. M. Cover and P. E. Hart. Nearest neighbor pattern classification. IEEE Transactions on Information Theory, IT-13(1):21-27, 1967.
- Y. Noh, B. Zhang, and D. D. Lee. Generative local metric learning for nearest neighbor classification. In Advances in Neural Information Processing Systems 23, pages 1822-1830, 2010.
- C. Shen, J. Kim, L. Wang, and A. van den Hengel. Positive semidefinite metric learning with boosting. In Advances in Neural Information Processing Systems 22, pages 1651-1659, 2009.
- T. Jaakkola and D. Haussler. Exploiting generative models in discriminative classifiers. In Advances in Neural Information Processing Systems 11, pages 487-493, 1998.
- N. Leonenko and L. Pronzato. Correction: A class of Rényi information estimators for multidimensional densities. Annals of Statistics, 38:3837-3838, 2010. https://doi.org/10.1214/10-AOS773
- N. Leonenko, L. Pronzato, and V. Savani. A class of Rényi information estimators for multidimensional densities. Annals of Statistics, 36:2153-2182, 2008. https://doi.org/10.1214/07-AOS539
- B. Poczos and J. Schneider. On the estimation of alpha-divergences. In Proceedings of the 15th International Conference on Artificial Intelligence and Statistics (AISTATS), pages 609-617, 2011.
- Y. Noh, M. Sugiyama, S. Liu, M. C. du Plessis, F. C. Park, and D. D. Lee. Bias reduction and metric learning for nearest-neighbor estimation of Kullback-Leibler divergence. In Proceedings of the 17th International Conference on Artificial Intelligence and Statistics (AISTATS), pages 669-677, 2014.
- X. Nguyen, M. J. Wainwright, and M. I. Jordan. Estimating divergence functionals and the likelihood ratio by penalized convex risk minimization. In Advances in Neural Information Processing Systems, pages 1089-1096, 2007.
- D. Garcia-Garcia, U. von Luxburg, and R. Santos-Rodriguez. Risk-based generalizations of f-divergences. In Proceedings of the 28th International Conference on Machine Learning, pages 417-424, 2011.