Big Data Analysis Using Metric Learning and Nearest Neighbor Methods

  • Published: 2014.07.15

References

  1. A. Y. Ng and M. I. Jordan. On discriminative vs. generative classifiers: A comparison of logistic regression and naive Bayes. In Advances in Neural Information Processing Systems 14, pages 841-848, 2001.
  2. J. V. Davis, B. Kulis, P. Jain, S. Sra, and I. S. Dhillon. Information-theoretic metric learning. In Proceedings of the 24th International Conference on Machine Learning, pages 209-216, 2007.
  3. A. Globerson and S. Roweis. Metric learning by collapsing classes. In Advances in Neural Information Processing Systems 18, pages 451-458, 2006.
  4. J. Goldberger, S. Roweis, G. Hinton, and R. Salakhutdinov. Neighbourhood components analysis. In Advances in Neural Information Processing Systems 17, pages 513-520, 2005.
  5. K. Weinberger, J. Blitzer, and L. Saul. Distance metric learning for large margin nearest neighbor classification. In Advances in Neural Information Processing Systems 18, pages 1473-1480, 2006.
  6. T. M. Cover and P. E. Hart. Nearest neighbor pattern classification. IEEE Transactions on Information Theory, IT-13(1):21-27, 1967.
  7. Y. Noh, B. Zhang, and D. D. Lee. Generative local metric learning for nearest neighbor classification. In Advances in Neural Information Processing Systems 23, pages 1822-1830, 2010.
  8. C. Shen, J. Kim, L. Wang, and A. van den Hengel. Positive semidefinite metric learning with boosting. In Advances in Neural Information Processing Systems 22, pages 1651-1659, 2009.
  9. T. Jaakkola and D. Haussler. Exploiting generative models in discriminative classifiers. In Advances in Neural Information Processing Systems 11, pages 487-493, 1998.
  10. N. Leonenko and L. Pronzato. Correction: A class of Rényi information estimators for multidimensional densities. Annals of Statistics, 38:3837-3838, 2010. https://doi.org/10.1214/10-AOS773
  11. N. Leonenko, L. Pronzato, and V. Savani. A class of Rényi information estimators for multidimensional densities. Annals of Statistics, 36:2153-2182, 2008. https://doi.org/10.1214/07-AOS539
  12. B. Poczos and J. Schneider. On the estimation of alpha-divergences. In Proceedings of the 15th International Conference on Artificial Intelligence and Statistics (AISTATS), pages 609-617, 2011.
  13. Y. Noh, M. Sugiyama, S. Liu, M. C. du Plessis, F. C. Park, and D. D. Lee. Bias reduction and metric learning for nearest-neighbor estimation of Kullback-Leibler divergence. In Proceedings of the 17th International Conference on Artificial Intelligence and Statistics (AISTATS), pages 669-677, 2014.
  14. X. Nguyen, M. J. Wainwright, and M. I. Jordan. Estimating divergence functionals and the likelihood ratio by penalized convex risk minimization. In Advances in Neural Information Processing Systems, pages 1089-1096, 2007.
  15. D. Garcia-Garcia, U. von Luxburg, and R. Santos-Rodriguez. Risk-based generalizations of f-divergences. In Proceedings of the 28th International Conference on Machine Learning, pages 417-424, 2011.