• Title/Summary/Keyword: Vasicek's test


ENTROPY-BASED GOODNESS OF FIT TEST FOR A COMPOSITE HYPOTHESIS

  • Lee, Sangyeol
    • Bulletin of the Korean Mathematical Society / v.53 no.2 / pp.351-363 / 2016
  • In this paper, we consider the entropy-based goodness of fit test (Vasicek's test) for a composite hypothesis. The test measures the discrepancy between the nonparametric entropy estimate and the parametric entropy estimate obtained from an assumed parametric family of distributions. It is shown that the proposed test is asymptotically normal under regularity conditions, but is affected by parameter estimates. As a remedy, a bootstrap version of Vasicek's test is proposed. Simulation results are provided for illustration.
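The nonparametric entropy estimate these tests rely on is Vasicek's m-spacing estimator. A minimal NumPy sketch (the function name and spacing window m are illustrative choices, not from the paper):

```python
import numpy as np

def vasicek_entropy(x, m):
    """Vasicek's m-spacing estimator of differential entropy:
    H_V = (1/n) * sum_i log( n/(2m) * (x_(i+m) - x_(i-m)) ),
    with order statistics outside 1..n clamped to the sample extremes."""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    i = np.arange(n)
    spacings = x[np.minimum(i + m, n - 1)] - x[np.maximum(i - m, 0)]
    return np.mean(np.log(n / (2.0 * m) * spacings))
```

For N(0,1) data the estimate approaches the true entropy (1/2)·log(2πe) ≈ 1.419 as n grows; the composite-hypothesis test above compares this nonparametric estimate against a parametric entropy estimate from the assumed family.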

Testing Uniformity Based on Vasicek's Estimator

  • Kim, Jong-Tae;Cha, Young-Joon
    • Journal of the Korean Data and Information Science Society / v.15 no.1 / pp.119-127 / 2004
  • To test uniformity of a population, we modify the test statistic based on the sample entropy in the literature and establish its limiting distribution under weaker conditions, improving the existing results. It is also shown that the proposed test statistic based on Vasicek's entropy estimator is consistent.

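Since U(0,1) maximizes differential entropy among distributions on the unit interval (its entropy is 0), a test of this kind rejects uniformity when the Vasicek estimate falls too far below 0. A minimal sketch with a Monte Carlo cutoff (all names and the rejection rule are illustrative; the paper's statistic is a modification of this idea):

```python
import numpy as np

def vasicek_entropy(x, m):
    # Vasicek's m-spacing entropy estimator (order statistics clamped at the ends)
    x = np.sort(np.asarray(x, dtype=float))
    n, i = len(x), np.arange(len(x))
    spacings = x[np.minimum(i + m, n - 1)] - x[np.maximum(i - m, 0)]
    return np.mean(np.log(n / (2.0 * m) * spacings))

def uniformity_cutoff(n, m, alpha=0.05, reps=2000, seed=0):
    # Monte Carlo lower critical value under H0: X ~ U(0,1).
    # Reject uniformity at level alpha when vasicek_entropy(x, m) < cutoff.
    rng = np.random.default_rng(seed)
    stats = [vasicek_entropy(rng.uniform(size=n), m) for _ in range(reps)]
    return np.quantile(stats, alpha)
```

A peaked alternative such as Beta(5,5), whose entropy is well below 0, falls under the cutoff and is rejected, while uniform samples typically are not.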

A Modified Entropy-Based Goodness-of-Fit Test for the Inverse Gaussian Distribution

  • Choi, Byung-Jin
    • The Korean Journal of Applied Statistics / v.24 no.2 / pp.383-391 / 2011
  • This paper presents a modified entropy-based test of fit for the inverse Gaussian distribution. The test is based on the difference between the entropy of the unknown data-generating distribution and that of the inverse Gaussian distribution. The entropy-difference estimator used as the test statistic employs Vasicek's sample entropy as the entropy estimator for the data-generating distribution and the uniformly minimum variance unbiased estimator as the entropy estimator for the inverse Gaussian distribution. The critical values of the test statistic, determined empirically, are provided in tabular form. Monte Carlo simulations compare the proposed test with the previous entropy-based test in terms of power.
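The entropy-difference construction can be sketched generically. Note the paper uses the UMVU estimator of the inverse Gaussian entropy; the sketch below substitutes a plain plug-in cross-entropy under the fitted model, so it illustrates the entropy-difference idea rather than the paper's exact statistic (all names are illustrative):

```python
import numpy as np

def vasicek_entropy(x, m):
    # Vasicek's m-spacing entropy estimator (order statistics clamped at the ends)
    x = np.sort(np.asarray(x, dtype=float))
    n, i = len(x), np.arange(len(x))
    spacings = x[np.minimum(i + m, n - 1)] - x[np.maximum(i - m, 0)]
    return np.mean(np.log(n / (2.0 * m) * spacings))

def ig_logpdf(x, mu, lam):
    # log density of the inverse Gaussian IG(mu, lambda)
    return 0.5 * np.log(lam / (2 * np.pi * x**3)) - lam * (x - mu)**2 / (2 * mu**2 * x)

def ig_gof_stat(x, m):
    # Entropy-difference (KL-type) statistic: cross-entropy of the data under
    # the fitted inverse Gaussian model minus Vasicek's entropy estimate.
    # Large values indicate departure from the inverse Gaussian family.
    mu = np.mean(x)                              # MLE of mu
    lam = 1.0 / np.mean(1.0 / x - 1.0 / mu)      # MLE of lambda
    cross_entropy = -np.mean(ig_logpdf(x, mu, lam))
    return cross_entropy - vasicek_entropy(x, m)
```

Under the null the statistic stays near zero (up to the estimator's bias), while clearly non-IG data, e.g. a uniform sample, produces a visibly larger value.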

Tests for Exponentiality by Kullback-Leibler Information

  • Kim, Jong-Tae;Lee, Woo-Dong;Kang, Suk-Bok
    • Journal of Korea Society of Industrial Information Systems / v.5 no.2 / pp.39-46 / 2000
  • Recently, van Es (1992) and Correa (1995) proposed estimators of entropy. In this paper, we propose goodness-of-fit test statistics for exponentiality based on Vasicek's estimator and Correa's estimator of the Kullback-Leibler information, and we compare the power of the proposed test statistics with the Kolmogorov-Smirnov, Kuiper, Cramér-von Mises, Watson, Anderson-Darling, and Finkelstein and Schafer statistics.

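The Kullback-Leibler statistic for exponentiality has a simple closed form once the scale is fitted: for Exp(θ) with θ̂ = x̄, the estimated cross-entropy −mean(log g(x; θ̂)) simplifies to log x̄ + 1, so the estimated KL information is log x̄ + 1 − H_V. A minimal sketch (names are illustrative; the papers also study Correa's variant of the entropy estimate):

```python
import numpy as np

def vasicek_entropy(x, m):
    # Vasicek's m-spacing entropy estimator (order statistics clamped at the ends)
    x = np.sort(np.asarray(x, dtype=float))
    n, i = len(x), np.arange(len(x))
    spacings = x[np.minimum(i + m, n - 1)] - x[np.maximum(i - m, 0)]
    return np.mean(np.log(n / (2.0 * m) * spacings))

def kl_exponentiality_stat(x, m):
    # Estimated Kullback-Leibler information between the data-generating
    # density and the fitted Exp(theta_hat), theta_hat = sample mean:
    #   KL_hat = log(mean(x)) + 1 - H_V.
    # Large values indicate departure from exponentiality.
    return np.log(np.mean(x)) + 1.0 - vasicek_entropy(x, m)
```

For exponential data the statistic stays near zero (the KL divergence to the fitted model vanishes asymptotically, apart from estimator bias), while for, say, U(0,1) data it settles near the positive KL divergence between the uniform and its best exponential fit.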