http://dx.doi.org/10.5351/CKSS.2012.19.6.837

Power Investigation of the Entropy-Based Test of Fit for Inverse Gaussian Distribution by the Information Discrimination Index  

Choi, Byungjin (Department of Applied Information Statistics, Kyonggi University)
Publication Information
Communications for Statistical Applications and Methods / v.19, no.6, 2012, pp. 837-847
Abstract
The inverse Gaussian distribution is widely used to analyze and model right-skewed data. To assess the appropriateness of the distribution prior to data analysis, Mudholkar and Tian (2002) proposed an entropy-based test of fit, built on the entropy power fraction (EPF) index suggested by Gokhale (1983). Their simulation results indicate that the power of the entropy-based test is superior to that of other goodness-of-fit tests; however, this finding rests on a small-scale simulation restricted to the standard exponential, Weibull W(1, 2) and lognormal LN(0.5, 1) alternatives. A large-scale simulation against a variety of alternative distributions could be performed to evaluate the power of the entropy-based test, but a theoretical method is more effective for investigating the power. In this paper, using the information discrimination (ID) index defined by Ehsan et al. (1995) as a mathematical tool, we scrutinize the power of the entropy-based test. The alternative distributions considered are the gamma, Weibull and lognormal distributions, which are widely used in data analysis as alternatives to the inverse Gaussian distribution. The results of the study are presented, and an illustrative example is analyzed.
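The theoretical route can be sketched concretely. Assuming the ID index of Ehsan et al. (1995) takes the familiar form ID(f : g) = 1 - exp{-K(f : g)}, where K(f : g) is the Kullback-Leibler divergence of the alternative f from the null g, the discrimination between each alternative and the inverse Gaussian null can be obtained by numerical integration rather than by simulation. The Python sketch below is a minimal illustration of that idea under these assumptions; the helper names kl_divergence and id_index and the parameter choices are illustrative, not the paper's implementation.

```python
import numpy as np
from scipy import stats
from scipy.integrate import quad

def kl_divergence(f_pdf, g_pdf, lower=0.0, upper=np.inf):
    # K(f:g) = integral over the support of f(x) * log(f(x) / g(x))
    def integrand(x):
        fx, gx = f_pdf(x), g_pdf(x)
        return fx * np.log(fx / gx) if fx > 0 and gx > 0 else 0.0
    value, _abserr = quad(integrand, lower, upper, limit=200)
    return value

def id_index(f_pdf, g_pdf):
    # Assumed form of the ID index: 1 - exp(-K(f:g));
    # 0 when f = g, approaching 1 as the densities separate.
    return 1.0 - np.exp(-kl_divergence(f_pdf, g_pdf))

# Inverse Gaussian null with mean 1, against alternatives of the kinds
# named in the abstract (parameterizations chosen for illustration only).
ig = stats.invgauss(mu=1.0)
alternatives = {
    "gamma(2, 0.5)":     stats.gamma(a=2.0, scale=0.5),
    "Weibull(1, 2)":     stats.weibull_min(c=2.0),
    "lognormal(0.5, 1)": stats.lognorm(s=1.0, scale=np.exp(0.5)),
}
for name, alt in alternatives.items():
    print(f"ID({name} : IG) = {id_index(alt.pdf, ig.pdf):.4f}")
```

On this reading, a larger ID value indicates an alternative that is easier to distinguish from the inverse Gaussian, which is the sense in which the index can stand in for a large-scale power simulation.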
Keywords
Inverse Gaussian distribution; entropy; entropy power fraction; information discrimination; test of fit; power
Citations & Related Records
연도 인용수 순위
  • Reference
1 Chhikara, R. S. and Folks, J. L. (1989). The Inverse Gaussian Distribution: Theory, Methodology, and Applications, Marcel Dekker, New York.
2 Edgeman, R. L. (1990). Assessing the inverse Gaussian distribution assumption, IEEE Transactions on Reliability, 39, 352-355.
3 Edgeman, R. L., Scott, R. C. and Pavur, R. J. (1988). A modified Kolmogorov-Smirnov test for the inverse Gaussian density with unknown parameters, Communications in Statistics-Simulation and Computation, 17, 1203-1212.
4 Ehsan, S., Ebrahimi, N. and Habibullah, M. (1995). Information distinguishability with application to analysis of failure data, Journal of the American Statistical Association, 90, 657-668.
5 Gokhale, D. V. (1983). On entropy-based goodness-of-fit tests, Computational Statistics and Data Analysis, 1, 157-165.
6 Jaynes, E. T. (1957). Information theory and statistical mechanics, Physical Review, 106, 620-630.
7 Lawless, J. F. (1982). Statistical Models and Methods for Lifetime Data, John Wiley, New York.
8 Mudholkar, G. S. and Tian, L. (2002). An entropy characterization of the inverse Gaussian distribution and related goodness-of-fit test, Journal of Statistical Planning and Inference, 102, 211-221.
9 Schrödinger, E. (1915). Zur Theorie der Fall- und Steigversuche an Teilchen mit Brownscher Bewegung, Physikalische Zeitschrift, 16, 289-295.
10 Seshadri, V. (1999). The Inverse Gaussian Distribution: Statistical Theory and Applications, Springer, New York.
11 Shannon, C. E. (1948). A mathematical theory of communication, Bell System Technical Journal, 27, 379-423, 623-656.
12 Smoluchowski, M. v. (1915). Notiz über die Berechnung der Brownschen Molekularbewegung bei der Ehrenhaft-Millikanschen Versuchsanordnung, Physikalische Zeitschrift, 16, 318-321.
13 Tweedie, M. C. K. (1957a). Statistical properties of inverse Gaussian distributions-I, Annals of Mathematical Statistics, 28, 362-377.
14 Tweedie, M. C. K. (1957b). Statistical properties of inverse Gaussian distributions-II, Annals of Mathematical Statistics, 28, 696-705.
15 Vasicek, O. (1976). A test for normality based on sample entropy, Journal of the Royal Statistical Society, Series B, 38, 54-59.