http://dx.doi.org/10.14317/jami.2017.371

NEW INFORMATION INEQUALITIES ON ABSOLUTE VALUE OF THE FUNCTIONS AND ITS APPLICATION  

CHHABRA, PRAPHULL (University of Engineering and Management)
Publication Information
Journal of Applied Mathematics & Informatics, vol. 35, no. 3-4, 2017, pp. 371-385
Abstract
Jain and Saraswat (2012) introduced a new generalized f-information divergence measure, from which many well-known and new information divergences can be obtained. In this work, we introduce new information inequalities in absolute form on this generalized divergence, considering convex normalized functions. We then apply these inequalities to derive new relations among well-known divergences, together with numerical verification. An application to mutual information is also presented, and an asymptotic approximation in terms of the Chi-square divergence is derived as well.
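To fix notation (a sketch of the standard Csiszar framework of [13, 14]; the exact form of the Jain-Saraswat generalized measure is given in [21] and is not reproduced here), an f-divergence of discrete distributions P = (p_1, ..., p_n) and Q = (q_1, ..., q_n) is built from a convex function f on (0, infinity) normalized by f(1) = 0:

    C_f(P, Q) = \sum_{i=1}^{n} q_i \, f\!\left(\frac{p_i}{q_i}\right).

Taking f(t) = (t - 1)^2 yields the Pearson Chi-square divergence \chi^2(P, Q) = \sum_i (p_i - q_i)^2 / q_i, while f(t) = t \log t yields the Kullback-Leibler divergence. For P close to Q, a second-order Taylor expansion of a twice-differentiable f about t = 1 gives C_f(P, Q) \approx \frac{f''(1)}{2} \chi^2(P, Q), which is the kind of Chi-square asymptotic approximation the abstract refers to.

A minimal numerical check of this approximation (illustrative only; the helper names are ours, not the paper's):

    import math

    def kl(p, q):
        # Kullback-Leibler divergence: sum of p_i * log(p_i / q_i)
        return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

    def chi_square(p, q):
        # Pearson Chi-square divergence: sum of (p_i - q_i)^2 / q_i
        return sum((pi - qi) ** 2 / qi for pi, qi in zip(p, q))

    q = [0.25, 0.25, 0.25, 0.25]
    p = [0.26, 0.24, 0.25, 0.25]   # small perturbation of q

    print(kl(p, q))                # ~0.000400 (for f(t) = t log t, f''(1) = 1)
    print(chi_square(p, q) / 2)    # ~0.000400, matching chi^2 / 2

As expected, for a nearby pair (P, Q) the Kullback-Leibler value and half the Chi-square value agree to several decimal places.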
Keywords
New generalized f-divergence; New information inequalities; Convex and normalized function; Asymptotic approximation; Mutual information;
References
1 K. Pearson, On the criterion that a given system of deviations from the probable in the case of a correlated system of variables is such that it can be reasonably supposed to have arisen from random sampling, Phil. Mag. 50 (1900), 157-172.
2 E.C. Pielou, Ecological diversity, New York, Wiley, 1975.
3 A. Renyi, On measures of entropy and information, Proc. 4th Berkeley Symposium on Math. Statist. and Prob. 1 (1961), 547-561.
4 R. Santos-Rodriguez, D. Garcia-Garcia, and J. Cid-Sueiro, Cost-sensitive classification based on Bregman divergences for medical diagnosis, In M.A. Wani, editor, Proceedings of the 8th International Conference on Machine Learning and Applications (ICMLA'09), Miami Beach, Fl., USA, December 13-15, (2009), 551-556.
5 C.E. Shannon, A mathematical theory of communication, Bell Syst. Tech. J. 27 (1948), 379-423 and 623-656.
6 R. Sibson, Information radius, Z. Wahrscheinlichkeitstheorie und Verw. Gebiete 14 (1969), 149-160.
7 I.J. Taneja, New developments in generalized information measures, Chapter in: Advances in Imaging and Electron Physics, Ed. P.W. Hawkes 91 (1995), 37-135.
8 B. Taskar, S. Lacoste-Julien, and M.I. Jordan, Structured prediction, dual extragradient and Bregman projections, Journal of Machine Learning Research 7 (2006), 1627-1653.
9 H. Theil, Statistical decomposition analysis, Amsterdam, North-Holland, 1972.
10 H. Theil, Economics and information theory, Amsterdam, North-Holland, 1967.
11 B. Vemuri, M. Liu, S. Amari, and F. Nielsen, Total Bregman divergence and its applications to DTI analysis, IEEE Transactions on Medical Imaging, 2010.
12 C.K. Chow and C.N. Liu, Approximating discrete probability distributions with dependence trees, IEEE Trans. Inform. Theory 14 (1968), no. 3, 462-467.
13 I. Csiszar, Information measures: a critical survey, in: Trans. Seventh Prague Conf. on Information Theory, Academia, Prague, 1974, 73-86.
14 I. Csiszar, Information-type measures of difference of probability distributions and indirect observations, Studia Sci. Math. Hungarica 2 (1967), 299-318.
15 D. Dacunha-Castelle, Ecole d'Ete de Probabilites de Saint-Flour VII-1977, Berlin, Heidelberg, New York: Springer, 1978.
16 S.S. Dragomir, V. Gluscevic, and C.E.M. Pearce, Approximation for the Csiszar f-divergence via midpoint inequalities, in Inequality Theory and Applications - Y.J. Cho, J.K. Kim, and S.S. Dragomir (Eds.), Nova Science Publishers, Inc., Huntington, New York, 1 (2001), 139-154.
17 D.V. Gokhale and S. Kullback, The information in contingency tables, New York, Marcel Dekker, 1978.
18 K.C. Jain and P. Chhabra, New information inequalities in terms of Chi-square divergence and its application, International Bulletin of Mathematical Research 1, no. 1, 37-48.
19 K.C. Jain and P. Chhabra, New information inequalities in terms of Variational distance and its application, Journal of New Results in Science 11 (2016), 30-40.
20 K.C. Jain and P. Chhabra, Series of new information divergences, properties and corresponding series of metric spaces, International Journal of Innovative Research in Science, Engineering and Technology 3 (2014), 12124-12132.
21 K.C. Jain and R.N. Saraswat, Some new information inequalities and its applications in information theory, International Journal of Mathematics Research, 4, no.3 (2012), 295-307.
22 L. Jones and C. Byrne, General entropy criteria for inverse problems with applications to data compression, pattern classification and cluster analysis, IEEE Trans. Inform. Theory 36 (1990), 23-30.
23 T.T. Kadota and L.A. Shepp, On the best finite set of linear observables for discriminating two Gaussian signals, IEEE Trans. Inform. Theory 13 (1967), 288-294.
24 T. Kailath, The divergence and Bhattacharyya distance measures in signal selection, IEEE Trans. Comm. Technology COM-15 (1967), 52-60.
25 D. Kazakos and T. Cotsidas, A decision theory approach to the approximation of discrete probability densities, IEEE Trans. Pattern Anal. Machine Intell. 1 (1980), 61-67.
26 A.N. Kolmogorov, On the approximation of distributions of sums of independent summands by infinitely divisible distributions, Sankhya 25, 159-174.
27 S. Kullback and R.A. Leibler, On information and sufficiency, Ann. Math. Statist. 22 (1951), 79-86.
28 F. Nielsen and S. Boltz, The Burbea-Rao and Bhattacharyya centroids, arXiv preprint, April 2010.
29 M.B. Bassat, f-Entropies, probability of error and feature selection, Inform. Control 39 (1978), 227-242.
30 L.M. Bregman, The relaxation method of finding the common point of convex sets and its application to the solution of problems in convex programming, USSR Comput. Math. Math. Phys. 7 (1967), 200-217.
31 J. Burbea and C.R. Rao, On the convexity of some divergence measures based on entropy functions, IEEE Trans. on Inform. Theory IT-28 (1982), 489-495.
32 H.C. Chen, Statistical pattern recognition, Hayden Book Co., Rochelle Park, New Jersey, 1973.