• Title/Summary/Keyword: entropy-based test

Search results: 67

A Modified Entropy-Based Goodness-of-Fit Test for Inverse Gaussian Distribution (역가우스분포에 대한 변형된 엔트로피 기반 적합도 검정)

  • Choi, Byung-Jin
    • The Korean Journal of Applied Statistics
    • /
    • v.24 no.2
    • /
    • pp.383-391
    • /
    • 2011
  • This paper presents a modified entropy-based test of fit for the inverse Gaussian distribution. The test is based on the entropy difference between the unknown data-generating distribution and the inverse Gaussian distribution. The entropy difference estimator used as the test statistic is obtained by employing Vasicek's sample entropy as an entropy estimator for the data-generating distribution and the uniformly minimum variance unbiased estimator as an entropy estimator for the inverse Gaussian distribution. The empirically determined critical values of the test statistic are provided in tabular form. Monte Carlo simulations are performed to compare the proposed test with the previous entropy-based test in terms of power.
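
As a rough illustration of the construction described in this abstract, the following Python sketch (assuming NumPy and SciPy) implements Vasicek's spacing estimator and an entropy-difference statistic for an inverse Gaussian null. Function names are illustrative, and a plug-in entropy of the fitted distribution stands in for the paper's uniformly minimum variance unbiased estimator, whose closed form is not reproduced here.

    # Hedged sketch, not the authors' code: Vasicek's spacing estimator and an
    # entropy-difference statistic for an inverse Gaussian (IG) null.
    import numpy as np
    from scipy import stats

    def vasicek_entropy(x, m):
        """Vasicek (1976) spacing estimator of differential entropy."""
        x = np.sort(np.asarray(x, dtype=float))
        n = len(x)
        lo = x[np.clip(np.arange(n) - m, 0, n - 1)]   # X_(i-m), clamped at X_(1)
        hi = x[np.clip(np.arange(n) + m, 0, n - 1)]   # X_(i+m), clamped at X_(n)
        return np.mean(np.log(n * (hi - lo) / (2.0 * m)))

    def ig_entropy_difference(x, m=3):
        """Nonparametric entropy minus a plug-in entropy of the fitted IG.
        (The paper uses a UMVUE of the IG entropy; the plug-in value is only
        a stand-in so the sketch stays self-contained.)"""
        x = np.asarray(x, dtype=float)
        mu_hat = x.mean()
        lam_hat = len(x) / np.sum(1.0 / x - 1.0 / mu_hat)         # IG maximum-likelihood estimates
        fitted = stats.invgauss(mu_hat / lam_hat, scale=lam_hat)  # SciPy form of IG(mu, lambda)
        return vasicek_entropy(x, m) - fitted.entropy()

    # Usage: extreme values of the difference indicate departure from the IG
    # model; the rejection direction and critical values come from the paper.
    rng = np.random.default_rng(0)
    sample = stats.invgauss(0.5, scale=2.0).rvs(size=50, random_state=rng)
    print(ig_entropy_difference(sample))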

ENTROPY-BASED GOODNESS OF FIT TEST FOR A COMPOSITE HYPOTHESIS

  • Lee, Sangyeol
    • Bulletin of the Korean Mathematical Society
    • /
    • v.53 no.2
    • /
    • pp.351-363
    • /
    • 2016
  • In this paper, we consider the entropy-based goodness-of-fit test (Vasicek's test) for a composite hypothesis. The test measures the discrepancy between the nonparametric entropy estimate and the parametric entropy estimate obtained from an assumed parametric family of distributions. It is shown that the proposed test statistic is asymptotically normal under regularity conditions, but that its distribution is affected by the parameter estimates. As a remedy, a bootstrap version of Vasicek's test is proposed. Simulation results are provided for illustration.
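
The following hedged Python sketch shows a generic parametric bootstrap of the kind this abstract describes, using a normal null for concreteness; the paper's exact statistic, estimators, and bootstrap scheme may differ in detail.

    # Hedged sketch of a parametric bootstrap for an entropy-based GOF statistic
    # with estimated parameters; a normal null is used for concreteness.
    import numpy as np

    def vasicek_entropy(x, m):
        x = np.sort(np.asarray(x, dtype=float))
        n = len(x)
        lo = x[np.clip(np.arange(n) - m, 0, n - 1)]
        hi = x[np.clip(np.arange(n) + m, 0, n - 1)]
        return np.mean(np.log(n * (hi - lo) / (2.0 * m)))

    def statistic(x, m=3):
        # plug-in normal entropy at the fitted parameters minus the sample entropy
        sigma_hat = np.std(x)                            # MLE of sigma under the normal null
        return 0.5 * np.log(2.0 * np.pi * np.e * sigma_hat**2) - vasicek_entropy(x, m)

    def bootstrap_pvalue(x, m=3, B=1000, seed=None):
        rng = np.random.default_rng(seed)
        t_obs = statistic(x, m)
        mu_hat, sigma_hat = np.mean(x), np.std(x)
        t_boot = np.array([statistic(rng.normal(mu_hat, sigma_hat, size=len(x)), m)
                           for _ in range(B)])
        return np.mean(t_boot >= t_obs)                  # large statistics indicate lack of fit

    # Usage with clearly non-normal data
    data = np.random.default_rng(1).exponential(size=80)
    print(bootstrap_pvalue(data, seed=2))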

Power Investigation of the Entropy-Based Test of Fit for Inverse Gaussian Distribution by the Information Discrimination Index

  • Choi, Byungjin
    • Communications for Statistical Applications and Methods
    • /
    • v.19 no.6
    • /
    • pp.837-847
    • /
    • 2012
  • The inverse Gaussian distribution is widely used in applications to analyze and model right-skewed data. To assess the appropriateness of the distribution prior to data analysis, Mudholkar and Tian (2002) proposed an entropy-based test of fit. The test is based on the entropy power fraction (EPF) index suggested by Gokhale (1983). Simulation results report that the power of the entropy-based test is superior to that of other goodness-of-fit tests; however, this observation rests on small-scale simulations for the standard exponential, Weibull W(1, 2) and lognormal LN(0.5, 1) distributions. A large-scale simulation against various alternative distributions would be needed to evaluate the power of the entropy-based test; however, a theoretical method is more effective for investigating power. In this paper, utilizing the information discrimination (ID) index defined by Ehsan et al. (1995) as a mathematical tool, we scrutinize the power of the entropy-based test. The selected alternative distributions are the gamma, Weibull and lognormal distributions, which are widely used in data analysis as alternatives to the inverse Gaussian distribution. The study results are provided and an illustrative example is analyzed.
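
The abstract does not reproduce the definition of the ID index. Purely as an illustration of quantifying how far an alternative family sits from an inverse Gaussian model, the hedged sketch below computes a Kullback-Leibler divergence numerically with SciPy; it is not the ID index itself, and the chosen parameter values are arbitrary.

    # Illustrative only: a numerical Kullback-Leibler divergence between an
    # alternative density f and an inverse Gaussian density g. It is NOT the
    # ID index cited in the abstract; parameter values are arbitrary.
    import numpy as np
    from scipy import stats
    from scipy.integrate import quad

    def kl_divergence(f, g, lower=1e-6, upper=100.0):
        """KL(f || g) = integral of f(x) * log(f(x) / g(x)) over the support."""
        def integrand(x):
            fx, gx = f.pdf(x), g.pdf(x)
            if fx <= 0.0 or gx <= 0.0:   # skip regions where a density underflows
                return 0.0
            return fx * np.log(fx / gx)
        value, _ = quad(integrand, lower, upper, limit=200)
        return value

    ig = stats.invgauss(0.5, scale=2.0)            # IG with mean 1 and shape lambda = 2
    lognormal_alt = stats.lognorm(s=0.5, scale=1.0)
    gamma_alt = stats.gamma(a=2.0, scale=0.5)

    print("KL(lognormal || IG):", kl_divergence(lognormal_alt, ig))
    print("KL(gamma || IG):    ", kl_divergence(gamma_alt, ig))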

ONLINE TEST BASED ON MUTUAL INFORMATION FOR TRUE RANDOM NUMBER GENERATORS

  • Kim, Young-Sik;Yeom, Yongjin;Choi, Hee Bong
    • Journal of the Korean Mathematical Society
    • /
    • v.50 no.4
    • /
    • pp.879-897
    • /
    • 2013
  • Shannon entropy is one of the most widely used randomness measures, especially for cryptographic applications. However, conventional entropy tests are not very sensitive to inter-bit dependency in random samples. In this paper, we propose new online randomness test schemes for true random number generators (TRNGs) based on the mutual information between consecutive k-bit output blocks, in order to test for inter-bit dependency in random samples. By estimating the block entropies of distinct lengths at the same time, it is possible to measure the mutual information, which is closely related to the amount of statistical dependency between two consecutive data blocks. In addition, we propose a new estimation method for entropies that accumulates intermediate values of the frequency counts. The proposed method can estimate entropy from fewer samples than the Maurer-Coron-type entropy test. Numerical simulations show that the proposed scheme can be used as a reliable online entropy estimator for TRNGs used by cryptographic modules.
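
A minimal Python sketch of the quantity at the heart of this test: the mutual information between consecutive k-bit output blocks, estimated from block frequencies via I(X; Y) = H(X) + H(Y) − H(X, Y). This plug-in version ignores the paper's online accumulation scheme and any Maurer-Coron-style corrections; names and block handling are illustrative.

    # Hedged sketch, not the authors' online algorithm: plug-in mutual
    # information between consecutive k-bit blocks of a bitstream,
    # I(X; Y) = H(X) + H(Y) - H(X, Y), estimated from block frequencies.
    import numpy as np
    from collections import Counter

    def block_entropy(symbols):
        """Plug-in Shannon entropy (in bits) of a sequence of hashable symbols."""
        counts = np.array(list(Counter(symbols).values()), dtype=float)
        p = counts / counts.sum()
        return -np.sum(p * np.log2(p))

    def consecutive_block_mutual_information(bits, k):
        """Mutual information between a k-bit block and the next k-bit block."""
        n_blocks = len(bits) // k
        blocks = [tuple(bits[i * k:(i + 1) * k]) for i in range(n_blocks)]
        x, y = blocks[:-1], blocks[1:]           # block at time t and at time t + 1
        xy = list(zip(x, y))                     # joint (2k-bit) blocks
        return block_entropy(x) + block_entropy(y) - block_entropy(xy)

    # Usage: an ideal TRNG should give a value near 0 (up to small plug-in bias)
    bits = np.random.default_rng(3).integers(0, 2, size=100_000)
    print(consecutive_block_mutual_information(bits, k=4))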

Testing Uniformity Based on Vasicek's Estimator

  • Kim, Jong-Tae;Cha, Young-Joon
    • Journal of the Korean Data and Information Science Society
    • /
    • v.15 no.1
    • /
    • pp.119-127
    • /
    • 2004
  • To test the uniformity of a population, we modify the sample-entropy-based test statistic from the literature and establish its limiting distribution under weaker conditions, improving on existing results. It is also shown that the proposed test statistic, based on Vasicek's entropy estimator, is consistent. (A brief simulation sketch of this kind of uniformity test follows this entry.)

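The following brief Monte Carlo sketch in Python is in the spirit of this entry, not the authors' modified statistic: since the U(0,1) density maximizes differential entropy among densities on [0, 1], small values of Vasicek's estimate count against uniformity, and the critical value below is simulated rather than taken from the paper.

    # Brief Monte Carlo sketch, not the authors' modified statistic.
    import numpy as np

    def vasicek_entropy(x, m):
        x = np.sort(np.asarray(x, dtype=float))
        n = len(x)
        lo = x[np.clip(np.arange(n) - m, 0, n - 1)]
        hi = x[np.clip(np.arange(n) + m, 0, n - 1)]
        return np.mean(np.log(n * (hi - lo) / (2.0 * m)))

    rng = np.random.default_rng(4)
    n, m, alpha = 50, 3, 0.05
    null_stats = [vasicek_entropy(rng.uniform(size=n), m) for _ in range(5000)]
    crit = np.quantile(null_stats, alpha)        # reject uniformity below this value

    sample = rng.beta(2.0, 2.0, size=n)          # a non-uniform alternative on [0, 1]
    print(vasicek_entropy(sample, m) < crit)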

Goodness-of-fit Tests for the Weibull Distribution Based on the Sample Entropy

  • Kang, Suk-Bok;Lee, Hwa-Jung
    • Journal of the Korean Data and Information Science Society
    • /
    • v.17 no.1
    • /
    • pp.259-268
    • /
    • 2006
  • For Type-II censored samples, we propose three modified entropy estimators based on Vasicek's, van Es's, and Correa's estimators. We also propose goodness-of-fit tests of the Weibull distribution based on the modified entropy estimators. We simulate the mean squared errors (MSE) of the proposed entropy estimators and the powers of the proposed tests, and compare the proposed tests with the modified Kolmogorov-Smirnov and Cramer-von Mises tests proposed by Kang et al. (2003). (A hedged, complete-sample version of this kind of MSE comparison follows this entry.)

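The Python sketch below illustrates the kind of MSE comparison the abstract describes, restricted to complete (uncensored) Weibull samples and the plain Vasicek estimator; the Type-II-censored modifications of the Vasicek, van Es, and Correa estimators studied in the paper are not reproduced.

    # Hedged sketch: Monte Carlo MSE of Vasicek's estimator against the
    # closed-form Weibull entropy, for complete samples only.
    import numpy as np

    def vasicek_entropy(x, m):
        x = np.sort(np.asarray(x, dtype=float))
        n = len(x)
        lo = x[np.clip(np.arange(n) - m, 0, n - 1)]
        hi = x[np.clip(np.arange(n) + m, 0, n - 1)]
        return np.mean(np.log(n * (hi - lo) / (2.0 * m)))

    def weibull_entropy(shape, scale=1.0):
        """Differential entropy of Weibull(shape, scale):
        gamma * (1 - 1/shape) + log(scale/shape) + 1."""
        return np.euler_gamma * (1.0 - 1.0 / shape) + np.log(scale / shape) + 1.0

    rng = np.random.default_rng(5)
    shape, n, m, reps = 2.0, 30, 3, 2000
    true_h = weibull_entropy(shape)
    estimates = np.array([vasicek_entropy(rng.weibull(shape, size=n), m)
                          for _ in range(reps)])
    print("true entropy:", true_h)
    print("simulated MSE of Vasicek's estimator:", np.mean((estimates - true_h) ** 2))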

Goodness-of-fit test for normal distribution based on parametric and nonparametric entropy estimators (모수적 엔트로피 추정량과 비모수적 엔트로피 추정량에 기초한 정규분포에 대한 적합도 검정)

  • Choi, Byungjin
    • Journal of the Korean Data and Information Science Society
    • /
    • v.24 no.4
    • /
    • pp.847-856
    • /
    • 2013
  • In this paper, we deal with testing goodness-of-fit for the normal distribution based on parametric and nonparametric entropy estimators. The minimum variance unbiased estimator of the entropy of the normal distribution is derived as a parametric entropy estimator to be used in the construction of a test statistic. For a nonparametric entropy estimator of the data-generating distribution under the alternative hypothesis, sample entropy and its modifications are used. The critical values of the proposed tests are estimated by Monte Carlo simulations and presented in tabular form. The performance of the proposed tests under some selected alternatives is investigated by means of simulations. The results show that the proposed tests have better power than the previous entropy-based test by Vasicek (1976). In applications, the new tests are expected to serve as a competitive tool for testing normality.
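
The sketch below illustrates the ingredients named in this abstract for a normal null: an unbiased (UMVUE-type) estimator of the normal entropy built from the digamma correction for E[log S^2], compared against Vasicek's sample entropy. The normalization of the paper's actual test statistic and its tabulated critical values are not reproduced; function names are illustrative.

    # Hedged sketch of the ingredients for a normal null: an unbiased estimator
    # of the normal entropy versus Vasicek's sample entropy.
    import numpy as np
    from scipy.special import digamma

    def vasicek_entropy(x, m):
        x = np.sort(np.asarray(x, dtype=float))
        n = len(x)
        lo = x[np.clip(np.arange(n) - m, 0, n - 1)]
        hi = x[np.clip(np.arange(n) + m, 0, n - 1)]
        return np.mean(np.log(n * (hi - lo) / (2.0 * m)))

    def normal_entropy_unbiased(x):
        """Unbiased estimator of 0.5*log(2*pi*e*sigma^2) under normality, using
        E[log S^2] = log(sigma^2) + digamma((n-1)/2) + log(2/(n-1))."""
        x = np.asarray(x, dtype=float)
        n = len(x)
        log_var = np.log(np.var(x, ddof=1)) - digamma((n - 1) / 2.0) - np.log(2.0 / (n - 1))
        return 0.5 * (np.log(2.0 * np.pi * np.e) + log_var)

    def normality_statistic(x, m=3):
        # large values suggest the data-generating entropy falls well below the
        # maximum-entropy normal benchmark, i.e. non-normality
        return normal_entropy_unbiased(x) - vasicek_entropy(x, m)

    rng = np.random.default_rng(6)
    print(normality_statistic(rng.normal(size=50)))       # near its null level
    print(normality_statistic(rng.exponential(size=50)))  # typically larger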

A Goodness of Fit Tests Based on the Partial Kullback-Leibler Information with the Type II Censored Data

  • Park, Sang-Un;Lim, Jong-Gun
    • Proceedings of the Korean Statistical Society Conference
    • /
    • 2003.10a
    • /
    • pp.233-238
    • /
    • 2003
  • Goodness-of-fit test statistics based on information discrepancy have been shown to perform very well (Vasicek 1976, Dudewicz and van der Meulen 1981, Chandra et al. 1982, Gokhale 1983, Arizono and Ohta 1989, Ebrahimi et al. 1992, etc.). Although the test is well defined for the non-censored case, the censored case has not been discussed in the literature. Therefore, we consider a goodness-of-fit test based on the partial Kullback-Leibler (KL) information with Type II censored data. We derive the partial KL information between the null distribution function and a nonparametric distribution function, and establish a goodness-of-fit test statistic. We consider the exponential and normal distributions and perform Monte Carlo simulations to compare the test statistic with some existing tests.


A Test of Fit for Inverse Gaussian Distribution Based on the Probability Integration Transformation (확률적분변환에 기초한 역가우스분포에 대한 적합도 검정)

  • Choi, Byungjin
    • The Korean Journal of Applied Statistics
    • /
    • v.26 no.4
    • /
    • pp.611-622
    • /
    • 2013
  • Mudholkar and Tian (2002) proposed an entropy-based test of fit for the inverse Gaussian distribution; however, the test can be applied only to the composite hypothesis of an inverse Gaussian distribution with an unknown location parameter. In this paper, we propose an entropy-based goodness-of-fit test for the inverse Gaussian distribution that can be applied to the composite hypothesis as well as to the simple hypothesis with a specified location parameter. The proposed test is based on the probability integration transformation. The critical values of the test statistic, estimated by simulations, are presented in tabular form. A simulation study compares the proposed test with the test of Mudholkar and Tian (2002) in terms of power under some selected alternatives. The results show that the proposed test has better power than the previous entropy-based test.
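
A hedged Python sketch of the probability-integral-transformation idea for an inverse Gaussian null: transform the data with the fitted IG distribution function and examine the near-uniformity of the transformed values with an entropy estimate. The paper's exact statistic and tabulated critical values are not reproduced; for the simple hypothesis, a specified parameter value would replace the corresponding estimate.

    # Hedged sketch of a PIT-based check against an inverse Gaussian null.
    import numpy as np
    from scipy import stats

    def vasicek_entropy(x, m):
        x = np.sort(np.asarray(x, dtype=float))
        n = len(x)
        lo = x[np.clip(np.arange(n) - m, 0, n - 1)]
        hi = x[np.clip(np.arange(n) + m, 0, n - 1)]
        return np.mean(np.log(n * (hi - lo) / (2.0 * m)))

    def pit_statistic(x, m=3):
        x = np.asarray(x, dtype=float)
        mu_hat = x.mean()
        lam_hat = len(x) / np.sum(1.0 / x - 1.0 / mu_hat)   # IG MLEs (composite hypothesis)
        u = stats.invgauss(mu_hat / lam_hat, scale=lam_hat).cdf(x)
        # U(0,1) maximizes entropy on [0, 1] at 0, so low values flag misfit
        return vasicek_entropy(u, m)

    rng = np.random.default_rng(7)
    print(pit_statistic(stats.invgauss(0.5, scale=2.0).rvs(size=200, random_state=rng)))
    print(pit_statistic(rng.lognormal(mean=0.0, sigma=1.0, size=200)))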