http://dx.doi.org/10.5351/KJAS.2007.20.1.079

Improving a Test for Normality Based on Kullback-Leibler Discrimination Information  

Choi, Byung-Jin (Department of Applied Information Statistics, Kyonggi University)
Publication Information
The Korean Journal of Applied Statistics / v.20, no.1, 2007, pp. 79-89
Abstract
A test for normality introduced by Arizono and Ohta (1989) is based on Kullback-Leibler discrimination information. The test statistic is derived from the discrimination information estimated using the sample entropy of Vasicek (1976) and the maximum likelihood estimator of the variance. However, these estimators are biased, so it is reasonable to use unbiased estimators to estimate the discrimination information accurately. In this paper, the Arizono-Ohta test for normality is improved. The derived test statistic is based on the bias-corrected entropy estimator and the uniformly minimum variance unbiased estimator of the variance. The properties of the improved KL test are investigated, and a Monte Carlo simulation is performed for power comparison.
Keywords
Normality; entropy; Kullback-Leibler discrimination information; power
References
1 Vasicek, O. (1976). A test for normality based on sample entropy, Journal of the Royal Statistical Society, Ser. B, 38, 54-59
2 Arizono, I. and Ohta, H. (1989). A test for normality based on Kullback-Leibler information, The American Statistician, 43, 20-22
3 Chandra, M., De Wet, T. and Singpurwalla, N. D. (1982). On the sample redundancy and a test for exponentiality, Communications in Statistics - Theory and Methods, 11, 429-438
4 D'Agostino, R. B. and Stephens, M. A. (1986). Goodness-of-fit Techniques, Marcel Dekker, New York
5 Dudewicz, E. J. and van der Meulen, E. C. (1981). Entropy-based tests of uniformity, Journal of the American Statistical Association, 76, 967-974
6 Ebrahimi, N., Habibullah, M. and Soofi, E. S. (1992). Testing exponentiality based on Kullback-Leibler information, Journal of the Royal Statistical Society, Ser. B, 54, 739-748
7 Gokhale, D. V. (1983). On entropy-based goodness-of-fit tests, Computational Statistics & Data Analysis, 1, 157-165
8 Kim, J. T., Lee, W. D., Ko, J. H., Yoon, Y. H. and Kang, S. G. (1999). Goodness of fit test for normality based on Kullback-Leibler information, The Korean Communications in Statistics, 6, 909-917
9 Kullback, S. and Leibler, R. A. (1951). On information and sufficiency, The Annals of Mathematical Statistics, 22, 79-86
10 Seshadri, V. (1999). The Inverse Gaussian Distribution: Statistical Theory and Applications, Springer, New York
11 Shannon, C. E. (1948). A mathematical theory of communication, The Bell System Technical Journal, 27, 379-423, 623-656
12 Kim, J. T. and Lee, W. D. (1998). A goodness-of-fit test for the Weibull and extreme value distributions based on Kullback-Leibler information, The Korean Journal of Applied Statistics, 11, 351-362
13 Wieczorkowski, R. and Grzegorzewski, P. (1999). Entropy estimators - improvements and comparisons, Communications in Statistics - Simulation and Computation, 28, 541-567