http://dx.doi.org/10.7465/jkdi.2013.24.6.1497

On scaled cumulative residual Kullback-Leibler information  

Hwang, Insung (Department of Applied Statistics, Yonsei University)
Park, Sangun (Department of Applied Statistics, Yonsei University)
Publication Information
Journal of the Korean Data and Information Science Society, v.24, no.6, 2013, pp. 1497-1501
Abstract
Cumulative residual Kullback-Leibler (CRKL) information is well defined on the empirical distribution function (EDF) and allows us to construct an EDF-based goodness-of-fit test statistic. However, we need to consider a scaled CRKL because CRKL is not scale invariant. In this paper, we consider several criteria for estimating the scale parameter in the scaled CRKL and compare the performances of the estimated CRKL in terms of both power and unbiasedness.
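A minimal sketch of the quantity discussed above, assuming the CRKL definition of Park, Rao and Shin (2012) with an exponential null distribution. The function names and the two candidate scale estimators shown here (the sample mean and the value of the scale that minimizes the empirical CRKL) are illustrative assumptions, not necessarily the criteria compared in the paper.

```python
# Illustrative sketch (not the authors' code): empirical CRKL for testing
# exponentiality, following Park, Rao and Shin (2012):
#   CRKL(F, G) = \int \bar{F}(x) log( \bar{F}(x) / \bar{G}(x) ) dx - (E_F X - E_G X),
# with \bar{G}(x) = exp(-x / theta). The EDF's survival function is piecewise
# constant, so the integral reduces to sums over order statistics.
import numpy as np

def crkl_exponential(x, theta):
    """Empirical CRKL between the EDF of x and the Exp(scale=theta) distribution."""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    # Empirical survival F̄_n(t) = (n - i)/n on [x_(i), x_(i+1)), i = 1, ..., n-1.
    surv = (n - np.arange(1, n)) / n
    gaps = np.diff(x)
    term_entropy = np.sum(surv * np.log(surv) * gaps)   # ∫ F̄_n log F̄_n dt
    term_cross = np.mean(x**2) / (2.0 * theta)           # (1/theta) ∫ t F̄_n(t) dt = E_n(X^2)/(2 theta)
    # E_G X - E_F X = theta - sample mean
    return term_entropy + term_cross + theta - np.mean(x)

def scaled_crkl(x, criterion="moment"):
    """Plug an estimated scale into CRKL; two illustrative (assumed) criteria."""
    x = np.asarray(x, dtype=float)
    if criterion == "moment":        # sample mean, the MLE of the exponential scale
        theta = x.mean()
    elif criterion == "min-crkl":    # scale value minimizing the empirical CRKL
        theta = np.sqrt(np.mean(x**2) / 2.0)
    else:
        raise ValueError(criterion)
    return crkl_exponential(x, theta)

# The statistic is nonnegative and equals zero when the empirical survival
# matches the fitted exponential survival; larger values indicate departure.
rng = np.random.default_rng(0)
print(scaled_crkl(rng.exponential(1.0, 50)))    # exponential data
print(scaled_crkl(rng.uniform(0.0, 1.0, 50)))   # non-exponential data
```

Because the empirical survival function is a step function, the statistic is computed directly from the order statistics without any density estimation, which is what makes an EDF-based test of this form convenient.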
Keywords
Empirical distribution; exponential distribution; goodness-of-fit test
Citations & Related Records
Times Cited By KSCI: 1
1 Balakrishnan, N., Rad, A. H. and Arghami, N. R. (2007). Testing exponentiality based on Kullback-Leibler information with progressively Type-II censored data. IEEE Transactions on Reliability, 56, 349-356.
2 Baratpour, S. and Rad, A. H. (2012). Testing goodness-of-fit for exponential distribution based on cumulative residual entropy. Communications in Statistics-Theory and Methods, 41, 1387-1396.
3 Park, S. (2012). Generalized Kullback-Leibler information and its extensions to censored and discrete cases. Journal of the Korean Data & Information Science Society, 23, 1223-1229.
4 Park, S. (2013). On censored cumulative residual Kullback-Leibler information and goodness-of-fit test with Type II censored data. Submitted to Statistical Papers (under 2nd revision).
5 Park, S., Rao, M. and Shin, D. W. (2012). On cumulative residual Kullback-Leibler information. Statistics and Probability Letters, 82, 2025-2032.
6 Park, S. and Shin, M. (2013). Kullback-Leibler information of Type I censored variable and its application. To appear in Statistics.
7 Rao, M., Chen, Y., Vemuri, B. C. and Wang, F. (2004). Cumulative residual entropy: A new measure of information. IEEE Transactions on Information Theory, 50, 1220-1228.
8 Soofi, E. S. (2000). Principal information theoretic approaches. Journal of the American Statistical Association, 95, 1349-1353.