NEW INFORMATION INEQUALITIES ON ABSOLUTE VALUE OF THE FUNCTIONS AND ITS APPLICATION

  • Received : 2016.12.22
  • Accepted : 2017.03.25
  • Published : 2017.05.30

Abstract

Jain and Saraswat (2012) introduced a new generalized f-information divergence measure, from which many well-known and new information divergences can be obtained. In this work, we introduce new information inequalities in absolute form for this generalized divergence by considering convex normalized functions. We then apply these inequalities to obtain new relations among well-known divergences, together with numerical verification. An application to mutual information is also presented, and an asymptotic approximation in terms of the Chi-square divergence is given.
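
For readers who want to reproduce the kind of numerical verification mentioned in the abstract, the minimal Python sketch below evaluates the classical Csiszar f-divergence together with a Jain-Saraswat-type generalized measure on two toy distributions. The functional form S_f(P, Q) = Σ_i q_i f((p_i + q_i)/(2 q_i)) used for the generalized measure is our reading of the 2012 construction and should be checked against [14]; the generating functions for the Kullback-Leibler and Chi-square divergences are standard.

```python
import numpy as np

def csiszar_f_divergence(p, q, f):
    """Classical Csiszar f-divergence: sum_i q_i * f(p_i / q_i)."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return float(np.sum(q * f(p / q)))

def generalized_divergence(p, q, f):
    """Assumed Jain-Saraswat form: S_f(P,Q) = sum_i q_i * f((p_i + q_i) / (2 q_i))."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return float(np.sum(q * f((p + q) / (2.0 * q))))

# Convex generating functions, normalized so that f(1) = 0.
kl_f = lambda t: t * np.log(t)       # generates the Kullback-Leibler divergence
chi2_f = lambda t: (t - 1.0) ** 2    # generates the Pearson Chi-square divergence

p = [0.2, 0.5, 0.3]   # toy distributions with full support,
q = [0.3, 0.4, 0.3]   # so no zero-division or log(0) issues arise

print("KL(P||Q)     =", csiszar_f_divergence(p, q, kl_f))
print("Chi2(P||Q)   =", csiszar_f_divergence(p, q, chi2_f))
print("S_chi2(P||Q) =", generalized_divergence(p, q, chi2_f))
```

Under the assumed form above, the Chi-square generator gives S_chi2(P, Q) = Σ_i (p_i − q_i)²/(4 q_i), i.e. one quarter of the ordinary Chi-square divergence; this is the kind of relation among divergences that the paper's inequalities make precise.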

References

  1. M. Ben-Bassat, f-entropies, probability of error and feature selection, Inform. Control 39 (1978), 227-242. https://doi.org/10.1016/S0019-9958(78)90587-9
  2. L.M. Bregman, The relaxation method of finding the common point of convex sets and its application to the solution of problems in convex programming, USSR Comput. Math. Math. Phys. 7 (1967), 200-217.
  3. J. Burbea and C.R. Rao, On the convexity of some divergence measures based on entropy functions, IEEE Trans. on Inform. Theory IT-28 (1982), 489-495.
  4. C.H. Chen, Statistical pattern recognition, Hayden Book Co., Rochelle Park, New Jersey, (1973).
  5. C.K. Chow and C.N. Liu, Approximating discrete probability distributions with dependence trees, IEEE Trans. Inform. Theory 14 (1968), no. 3, 462-467. https://doi.org/10.1109/TIT.1968.1054142
  6. I. Csiszar, Information measures: a critical survey, in: Trans. Seventh Prague Conf. on Information Theory, Academia, Prague, (1974), 73-86.
  7. I. Csiszar, Information-type measures of difference of probability distributions and indirect observations, Studia Sci. Math. Hungarica 2 (1967), 299-318.
  8. D. Dacunha-Castelle, Ecole d'Ete de Probabilites de Saint-Flour VII-1977, Berlin, Heidelberg, New York: Springer, (1978).
  9. S.S. Dragomir, V. Gluscevic and C.E.M. Pearce, Approximation for Csiszar's f-divergence via midpoint inequalities, in: Inequality Theory and Applications, Y.J. Cho, J.K. Kim and S.S. Dragomir (Eds.), Nova Science Publishers, Inc., Huntington, New York, 1 (2001), 139-154.
  10. D.V. Gokhale and S. Kullback, Information in contingency tables, New York, Marcel Dekker, (1978).
  11. K.C. Jain and P. Chhabra, New information inequalities in terms of Chi-square divergence and its application, International Bulletin of Mathematical Research 1, no. 1, 37-48.
  12. K.C. Jain and P. Chhabra, New information inequalities in terms of Variational distance and its application, Journal of New Results in Science 11 (2016), 30-40.
  13. K.C. Jain and P. Chhabra, Series of new information divergences, properties and corresponding series of metric spaces, International Journal of Innovative Research in Science, Engineering and Technology 3 (2014), 12124-12132.
  14. K.C. Jain and R.N. Saraswat, Some new information inequalities and its applications in information theory, International Journal of Mathematics Research 4 (2012), no. 3, 295-307.
  15. L. Jones and C. Byrne, General entropy criteria for inverse problems with applications to data compression, pattern classification and cluster analysis, IEEE Trans. Inform. Theory 36 (1990), 23-30. https://doi.org/10.1109/18.50370
  16. T.T. Kadota and L.A. Shepp, On the best finite set of linear observables for discriminating two Gaussian signals, IEEE Trans. Inform. Theory 13 (1967), 288-294.
  17. T. Kailath, The divergence and Bhattacharyya distance measures in signal selection, IEEE Trans. Comm. Technology COM-15 (1967), 52-60.
  18. D. Kazakos and T. Cotsidas, A decision theory approach to the approximation of discrete probability densities, IEEE Trans. Pattern Anal. Machine Intell. 1 (1980), 61-67.
  19. A.N. Kolmogorov, On the approximation of distributions of sums of independent summands by infinitely divisible distributions, Sankhya 25 (1963), 159-174.
  20. S. Kullback and R.A. Leibler, On information and sufficiency, Ann. Math. Statist. 22 (1951), 79-86. https://doi.org/10.1214/aoms/1177729694
  21. F. Nielsen and S. Boltz, The Burbea-Rao and Bhattacharyya centroids, arXiv preprint, Apr. 2010.
  22. K. Pearson, On the criterion that a given system of deviations from the probable in the case of a correlated system of variables is such that it can be reasonably supposed to have arisen from random sampling, Phil. Mag. 50 (1900), 157-172. https://doi.org/10.1080/14786440009463897
  23. E.C. Pielou, Ecological diversity, New York, Wiley, (1975).
  24. A. Renyi, On measures of entropy and information, Proc. 4th Berkeley Symposium on Math. Statist. and Prob. 1 (1961), 547-561.
  25. R. Santos-Rodriguez, D. Garcia-Garcia and J. Cid-Sueiro, Cost-sensitive classification based on Bregman divergences for medical diagnosis, in M.A. Wani, editor, Proceedings of the 8th International Conference on Machine Learning and Applications (ICMLA'09), Miami Beach, FL, USA, December 13-15, (2009), 551-556.
  26. C.E. Shannon, A mathematical theory of communication, Bell Syst. Tech. J. 27 (1948), 379-423 and 623-656.
  27. R. Sibson, Information radius, Z. Wahrscheinlichkeitstheorie und Verw. Gebiete 14 (1969), 149-160.
  28. I.J. Taneja, New developments in generalized information measures, Chapter in: Advances in Imaging and Electron Physics, Ed. P.W. Hawkes 91 (1995), 37-135.
  29. B. Taskar, S. Lacoste-Julien and M.I. Jordan, Structured prediction, dual extragradient and Bregman projections, Journal of Machine Learning Research 7 (2006), 1627-1653.
  30. H. Theil, Statistical decomposition analysis, Amsterdam, North-Holland, 1972.
  31. H. Theil, Economics and information theory, Amsterdam, North-Holland, 1967.
  32. B. Vemuri, M. Liu, S. Amari, and F. Nielsen, Total Bregman divergence and its applications to DTI analysis, IEEE Transactions on Medical Imaging, 2010.