A Test of Fit for Inverse Gaussian Distribution Based on the Probability Integration Transformation

  • Choi, Byungjin (Department of Applied Information Statistics, Kyonggi University)
  • Received : 2013.04.11
  • Accepted : 2013.08.06
  • Published : 2013.08.31

Abstract

Mudholkar and Tian (2002) proposed an entropy-based test of fit for the inverse Gaussian distribution; however, their test can be applied only when the location parameter is unknown, that is, when both the location and scale parameters are unknown or when only the scale parameter is known. In this paper, we propose an entropy-based goodness-of-fit test for the inverse Gaussian distribution that can also be applied when the location parameter is specified, that is, when both parameters are known or when only the location parameter is known. The proposed test is based on the probability integral transformation. Critical values of the test statistic, estimated by simulation for various sample sizes and window sizes, are presented in tabular form, together with a formula for computing approximate critical values. A simulation study under selected alternatives is performed to compare the power of the proposed test with that of the test of Mudholkar and Tian (2002). The results show that the proposed test is more powerful than the existing entropy-based test.
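The paper itself supplies the exact statistic, parameter estimators, and critical-value tables. As a rough illustration of the approach described above, the sketch below (Python, assuming scipy is available) transforms a sample with the hypothesized inverse Gaussian CDF and computes a Vasicek (1976)-type spacing estimate of entropy on the transformed values, rejecting for small values because the uniform distribution on (0, 1) maximizes entropy. The function names vasicek_entropy and pit_entropy_statistic, the use of scipy.stats.invgauss, and the rejection rule are illustrative assumptions, not the paper's implementation.

import numpy as np
from scipy.stats import invgauss

def vasicek_entropy(u, m):
    # Vasicek (1976) spacing-based entropy estimate with window size m.
    u = np.sort(np.asarray(u, dtype=float))
    n = len(u)
    # Clamp order-statistic indices outside 1..n to the sample extremes, as in Vasicek's definition.
    padded = np.concatenate(([u[0]] * m, u, [u[-1]] * m))
    spacings = padded[2 * m:] - padded[:n]  # u_(i+m) - u_(i-m)
    return np.mean(np.log(n / (2.0 * m) * spacings))

def pit_entropy_statistic(x, mu, lam, m):
    # Probability integral transform with the hypothesized IG(mu, lam) CDF, then the
    # entropy of the transformed values; values near 0 are consistent with uniformity.
    # scipy's invgauss with shape mu/lam and scale lam corresponds to IG(mu, lam).
    u = invgauss.cdf(np.asarray(x, dtype=float), mu / lam, scale=lam)
    return vasicek_entropy(u, m)

# Hypothetical usage: simulate IG(1, 2) data and evaluate the statistic with window size 3.
# A rejection threshold would come from the paper's tables or from Monte Carlo under H0.
x = invgauss.rvs(1.0 / 2.0, scale=2.0, size=50, random_state=1)
print(pit_entropy_statistic(x, mu=1.0, lam=2.0, m=3))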

Keywords

References

  1. Correa, J. C. (1995). A new estimator of entropy, Communications in Statistics-Theory and Methods, 24, 2439-2449. https://doi.org/10.1080/03610929508831626
  2. Cressie, N. (1976). On the logarithms of high-order spacings, Biometrika, 63, 343-355. https://doi.org/10.1093/biomet/63.2.343
  3. D'Agostino, R. B. and Stephens, M. A. (1986). Goodness-of-fit Techniques, Marcel Dekker, New York.
  4. Dudewicz, E. J. and van der Meulen, E. C. (1981). Entropy-based tests of uniformity, Journal of the American Statistical Association, 76, 967-974. https://doi.org/10.1080/01621459.1981.10477750
  5. Dudewicz, E. J. and van der Meulen, E. C. (1987). New Perspectives in Theoretical and Applied Statistics, Wiley, New York.
  6. Ebrahimi, N., Pflughoeft, K. and Soofi, E. S. (1994). Two measures of sample entropy, Statistics and Probability Letters, 20, 225-234. https://doi.org/10.1016/0167-7152(94)90046-9
  7. Edgeman, R. L. (1990). Assessing the inverse Gaussian distribution assumption, IEEE Transactions on Reliability, 39, 352-355. https://doi.org/10.1109/24.103017
  8. Gokhale, D. V. (1983). On the entropy-based goodness-of-fit tests, Computational Statistics and Data Analysis, 1, 157-165. https://doi.org/10.1016/0167-9473(83)90087-7
  9. Gyorfi, L. and van der Meulen, E. C. (1987). Density-free convergence properties of various estimators of entropy, Computational Statistics and Data Analysis, 5, 425-436. https://doi.org/10.1016/0167-9473(87)90065-X
  10. Gyorfi, L. and van der Meulen, E. C. (1990). An entropy estimate based on a kernel density estimation. In: Limit Theorems in Probability and Statistics, Colloquia Mathematica Societatis Janos Bolyai, 57, 229-240.
  11. Hall, P. (1984). Limit theorems for sums of general functions of m-spacings, Mathematical Proceedings of the Cambridge Philosophical Society, 96, 517-532.
  12. Hall, P. (1986). On powerful distributional tests based on sample spacings, Journal of Multivariate Analysis, 19, 201-255. https://doi.org/10.1016/0047-259X(86)90027-8
  13. Jaynes, E. T. (1957). Information theory and statistical mechanics, Physical Review, 106, 620-630. https://doi.org/10.1103/PhysRev.106.620
  14. Michael, J. R., Schucany, W. R. and Haas, R. W. (1976). Generating random variates using transformations with multiple roots, The American Statistician, 30, 88-90.
  15. Mudholkar, G. S. and Tian, L. (2002). An entropy characterization of the inverse Gaussian distribution and related goodness-of-fit test, Journal of Statistical Planning and Inference, 102, 211-221. https://doi.org/10.1016/S0378-3758(01)00099-4
  16. Proschan, F. (1963). Theoretical explanation of observed decreasing failure rate, Technometrics, 5, 375-384. https://doi.org/10.1080/00401706.1963.10490105
  17. Shannon, C. E. (1948). A mathematical theory of communication, Bell System Technical Journal, 27, 379-423, 623-656. https://doi.org/10.1002/j.1538-7305.1948.tb01338.x
  18. Shuster, J. J. (1968). On the inverse Gaussian distribution function, Journal of the American Statistical Association, 63, 1514-1516. https://doi.org/10.1080/01621459.1968.10480942
  19. van Es, B. (1992). Estimating functionals related to a density by a class of statistics based on spacings, Scandinavian Journal of Statistics, 19, 61-72.
  20. Vasicek, O. (1976). A test for normality based on sample entropy, Journal of the Royal Statistical Society, Series B, 38, 54-59.