A Modified Entropy-Based Goodness-of-Fit Test for Inverse Gaussian Distribution

  • Choi, Byung-Jin (Department of Applied Information Statistics, Kyonggi University)
  • Received : 2011.02
  • Accepted : 2011.03
  • Published : 2011.04.30

Abstract

This paper presents a modified entropy-based test of fit for the inverse Gaussian distribution. The test is based on the entropy difference between the unknown data-generating distribution and the inverse Gaussian distribution, and the test statistic is an estimator of this entropy difference. The estimator is obtained by using Vasicek's sample entropy as the entropy estimator for the data-generating distribution and the uniformly minimum variance unbiased estimator as the entropy estimator for the inverse Gaussian distribution. Critical values of the test statistic, determined empirically for various sample sizes and window sizes, are provided in tabular form. To assess the power of the proposed test, Monte Carlo simulations are performed over a range of alternative distributions and sample sizes, and the results are compared with those of the previous entropy-based test.
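
The construction can be sketched numerically. Vasicek's sample entropy for a sample $X_1, \ldots, X_n$ with window size $m$ is $HV_{mn} = \frac{1}{n}\sum_{i=1}^{n}\ln\left\{\frac{n}{2m}\left(X_{(i+m)} - X_{(i-m)}\right)\right\}$, where $X_{(i)} = X_{(1)}$ for $i < 1$ and $X_{(i)} = X_{(n)}$ for $i > n$. The Python sketch below is illustrative only: it pairs Vasicek's estimator with a plug-in entropy of the MLE-fitted inverse Gaussian (evaluated by numerical integration) rather than the paper's uniformly minimum variance unbiased estimator, and it calibrates the critical value by a parametric bootstrap instead of the tabulated values; the function names, window size, and simulation settings are assumptions of the sketch, not taken from the paper.

```python
# Illustrative entropy-difference goodness-of-fit sketch for the inverse Gaussian
# distribution (not the paper's UMVUE-based statistic or its tabulated critical values).
import numpy as np
from scipy.stats import invgauss


def vasicek_entropy(x, m):
    """Vasicek's sample entropy H_{m,n} based on m-spacings of the order statistics."""
    x = np.sort(np.asarray(x, dtype=float))
    n = x.size
    # Replicate the smallest and largest order statistics m times, as in Vasicek (1976).
    y = np.concatenate([np.repeat(x[0], m), x, np.repeat(x[-1], m)])
    spacings = y[2 * m:] - y[:-2 * m]            # X_{(i+m)} - X_{(i-m)}, i = 1, ..., n
    return np.mean(np.log(n / (2.0 * m) * spacings))


def ig_mle(x):
    """Maximum likelihood estimates (mu, lambda) of the inverse Gaussian parameters."""
    x = np.asarray(x, dtype=float)
    mu = x.mean()
    lam = x.size / np.sum(1.0 / x - 1.0 / mu)
    return mu, lam


def entropy_difference(x, m):
    """Entropy of the MLE-fitted inverse Gaussian minus Vasicek's sample entropy."""
    mu, lam = ig_mle(x)
    # scipy's invgauss(mu/lam, scale=lam) is the IG distribution with mean mu and
    # shape lam; .entropy() evaluates its differential entropy numerically.
    return invgauss(mu / lam, scale=lam).entropy() - vasicek_entropy(x, m)


def bootstrap_critical_value(x, m, alpha=0.05, n_rep=5000, seed=None):
    """Upper-alpha critical value from a parametric bootstrap at the fitted parameters
    (an illustrative calibration; the paper tabulates critical values by sample size
    and window size instead)."""
    rng = np.random.default_rng(seed)
    n = len(x)
    mu, lam = ig_mle(x)
    stats = np.array([
        entropy_difference(invgauss.rvs(mu / lam, scale=lam, size=n, random_state=rng), m)
        for _ in range(n_rep)
    ])
    return np.quantile(stats, 1.0 - alpha)


# Illustrative use: simulate inverse Gaussian data and apply the sketch test.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    data = invgauss.rvs(0.5, scale=2.0, size=50, random_state=rng)  # IG with mean 1, lambda 2
    m = 4                                                            # window size (illustrative)
    stat = entropy_difference(data, m)
    cv = bootstrap_critical_value(data, m, alpha=0.05, n_rep=2000, seed=1)
    print(f"statistic = {stat:.4f}, 5% critical value = {cv:.4f}, reject = {stat > cv}")
```

Rejecting for large values of the difference and calibrating by parametric bootstrap are choices made for this sketch only; the paper instead uses its UMVUE-based statistic with critical values tabulated by sample size and window size.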

Keywords

References

  1. Ahmed, N. A. and Gokhale, D. V. (1989). Entropy expressions and their estimators for multivariate distributions, IEEE Transactions on Information Theory, 35, 688–692. https://doi.org/10.1109/18.30996
  2. Chhikara, R. S. and Folks, J. L. (1989). The Inverse Gaussian Distribution: Theory, Methodology, and Applications, Marcel Dekker, New York.
  3. Choi, B. (2006). Minimum variance unbiased estimation for the maximum entropy of the transformed inverse Gaussian random variable by $Y=X^{-1/2}$, The Korean Communications in Statistics, 13, 657–667. https://doi.org/10.5351/CKSS.2006.13.3.657
  4. Choi, B. and Kim, K. (2006). Testing goodness-of-fit for Laplace distribution based on maximum entropy, Statistics, 40, 517–531. https://doi.org/10.1080/02331880600822473
  5. Cressie, N. (1976). On the logarithms of high-order spacings, Biometrika, 63, 343–355. https://doi.org/10.1093/biomet/63.2.343
  6. Dudewicz, E. J. and van der Meulen, E. C. (1981). Entropy-based tests of uniformity, Journal of the American Statistical Association, 76, 967–974. https://doi.org/10.2307/2287597
  7. Edgeman, R. L. (1990). Assessing the inverse Gaussian distribution assumption, IEEE Transactions on Reliability, 39, 352–355. https://doi.org/10.1109/24.103017
  8. Edgeman, R. L., Scott, R. C. and Pavur, R. J. (1988). A modified Kolmogorov-Smirnov test for the inverse Gaussian density with unknown parameters, Communications in Statistics-Simulation and Computation, 17, 1203–1212. https://doi.org/10.1080/03610918808812721
  9. Edgeman, R. L., Scott, R. C. and Pavur, R. J. (1992). Quadratic statistics for the goodness-of-fit test for the inverse Gaussian distribution, IEEE Transactions on Reliability, 41, 118–123. https://doi.org/10.1109/24.126682
  10. Gradshteyn, I. S. and Ryzhik, I. M. (2000). Table of Integrals, Series, and Products, Academic Press, San Diego.
  11. Grzegorzewski, P. and Wieczorkowski, P. (1999). Entropy-based goodness-of-fit test for exponentiality, Communications in Statistics-Theory and Methods, 28, 1183–1202. https://doi.org/10.1080/03610929908832351
  12. Kapur, J. N. and Kesavan, H. K. (1992). Entropy Optimization Principles with Applications, Academic Press, San Diego.
  13. Lieblein, J. and Zelen, M. (1956). Statistical investigation of the fatigue life of deep groove ball bearings, Journal of Research of the National Bureau of Standards, 57, 273–316. https://doi.org/10.6028/jres.057.033
  14. Michael, J. R., Schucany, W. R. and Haas, R. W. (1976). Generating random variates using transformations with multiple roots, The American Statistician, 30, 88–90. https://doi.org/10.2307/2683801
  15. Mudholkar, G. S. and Tian, L. (2002). An entropy characterization of the inverse Gaussian distribution and related goodness-of-fit test, Journal of Statistical Planning and Inference, 102, 211–221. https://doi.org/10.1016/S0378-3758(01)00099-4
  16. O'Reilly, F. J. and Rueda, R. (1992). Goodness of fit for the inverse Gaussian distribution, The Canadian Journal of Statistics, 20, 387–397. https://doi.org/10.2307/3315609
  17. Seshadri, V. (1999). The Inverse Gaussian Distribution: Statistical Theory and Applications, Springer, New York.
  18. Shannon, C. E. (1948). A mathematical theory of communication, Bell System Technical Journal, 27, 379–423, 623–656. https://doi.org/10.1002/j.1538-7305.1948.tb01338.x
  19. van Es, B. (1992). Estimating functionals related to a density by a class of statistics based on spacings, Scandinavian Journal of Statistics, 19, 61–72.
  20. Vasicek, O. (1976). A test for normality based on sample entropy, Journal of the Royal Statistical Society, Series B, 38, 54–59.