DOI: http://dx.doi.org/10.14317/jami.2016.295

A NEW EXPONENTIAL DIRECTED DIVERGENCE INFORMATION MEASURE  

JAIN, K.C. (Department of Mathematics, Malaviya National Institute of Technology)
CHHABRA, PRAPHULL (Department of Mathematics, Malaviya National Institute of Technology)
Publication Information
Journal of Applied Mathematics & Informatics, v.34, no. 3-4, 2016, pp. 295-308
Abstract
Different divergence measures suit different problems, so it is always desirable to develop new divergence measures. In the present work, a new information divergence measure, exponential in nature, is introduced and characterized. Bounds on this new measure are obtained in terms of various symmetric and non-symmetric measures, together with numerical verification using two discrete distributions: Binomial and Poisson. Fuzzy and useful information measures corresponding to the new exponential divergence measure are also introduced.
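The abstract describes numerical verification of the bounds using Binomial and Poisson distributions but does not reproduce the new exponential measure's formula, which appears only in the full text. The sketch below is therefore a minimal illustration of that kind of check, not the authors' method: it compares a Binomial(n, p) distribution against the Poisson distribution with matching mean, using the classical Kullback-Leibler directed divergence as a stand-in for the new measure. The parameter values n = 20, p = 0.1 are assumptions chosen only for illustration.

```python
import math

def binomial_pmf(n, p, k):
    """P(X = k) for X ~ Binomial(n, p)."""
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

def poisson_pmf(lam, k):
    """P(Y = k) for Y ~ Poisson(lam)."""
    return math.exp(-lam) * lam**k / math.factorial(k)

def kl_divergence(ps, qs):
    """Kullback-Leibler directed divergence, sum_i p_i log(p_i / q_i).
    A classical stand-in: the paper's new exponential measure is
    defined only in the full text, not in this excerpt."""
    return sum(p * math.log(p / q) for p, q in zip(ps, qs) if p > 0)

n, p = 20, 0.1          # illustrative Binomial parameters (assumed)
lam = n * p             # Poisson mean matched to the Binomial mean
support = range(n + 1)  # truncate the Poisson tail to the Binomial support

P = [binomial_pmf(n, p, k) for k in support]
Q = [poisson_pmf(lam, k) for k in support]

print(f"KL(Binomial || Poisson) = {kl_divergence(P, Q):.6f}")
```

Truncating the Poisson support to {0, ..., n} leaves a small amount of tail mass unaccounted for; for a sketch comparing two closely matched discrete distributions this is negligible, but a careful verification of divergence bounds would either renormalize Q or extend the support until the tail mass is below a tolerance.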
Keywords
New exponential divergence measure; Bounds; Numerical verification; Comparison of divergence measures