• Title/Summary/Keyword: entropy estimator

29 search results

A Modified Entropy-Based Goodness-of-Fit Test for Inverse Gaussian Distribution (역가우스분포에 대한 변형된 엔트로피 기반 적합도 검정)

  • Choi, Byung-Jin
    • The Korean Journal of Applied Statistics
    • /
    • v.24 no.2
    • /
    • pp.383-391
    • /
    • 2011
  • This paper presents a modified entropy-based test of fit for the inverse Gaussian distribution. The test is based on the entropy difference between the unknown data-generating distribution and the inverse Gaussian distribution. The entropy difference estimator used as the test statistic is obtained by employing Vasicek's sample entropy as an entropy estimator for the data-generating distribution and the uniformly minimum variance unbiased estimator as an entropy estimator for the inverse Gaussian distribution. Empirically determined critical values of the test statistic are provided in tabular form. Monte Carlo simulations are performed to compare the proposed test with the previous entropy-based test in terms of power.
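Several of the results below build on Vasicek's (1976) sample entropy, the nonparametric estimator used for the data-generating distribution above. A minimal sketch of that estimator, with the boundary order statistics clamped to the sample extremes as in the standard definition (the function name is ours):

```python
import math

def vasicek_entropy(x, m):
    """Vasicek (1976) sample entropy with window size m:
    H_{m,n} = (1/n) * sum_i log( (n/(2m)) * (x_(i+m) - x_(i-m)) ),
    where order statistics with indices outside [1, n] are
    replaced by the smallest/largest observation."""
    n = len(x)
    xs = sorted(x)
    total = 0.0
    for i in range(n):
        hi = xs[min(i + m, n - 1)]  # x_(i+m), clamped at the maximum
        lo = xs[max(i - m, 0)]      # x_(i-m), clamped at the minimum
        total += math.log(n * (hi - lo) / (2 * m))
    return total / n
```

On an evenly spaced sample from the unit interval (true entropy 0), the estimate is close to zero apart from a small boundary-induced negative bias, which is the bias these papers' modifications target.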

Goodness-of-fit Tests for the Weibull Distribution Based on the Sample Entropy

  • Kang, Suk-Bok;Lee, Hwa-Jung
    • Journal of the Korean Data and Information Science Society
    • /
    • v.17 no.1
    • /
    • pp.259-268
    • /
    • 2006
  • For a Type-II censored sample, we propose three modified entropy estimators based on Vasicek's estimator, van Es' estimator, and Correa's estimator. We also propose goodness-of-fit tests for the Weibull distribution based on the modified entropy estimators. We simulate the mean squared errors (MSE) of the proposed entropy estimators and the powers of the proposed tests. We also compare the proposed tests with the modified Kolmogorov-Smirnov and Cramer-von Mises tests proposed by Kang et al. (2003).


Comparison of Two Parametric Estimators for the Entropy of the Lognormal Distribution (로그정규분포의 엔트로피에 대한 두 모수적 추정량의 비교)

  • Choi, Byung-Jin
    • Communications for Statistical Applications and Methods
    • /
    • v.18 no.5
    • /
    • pp.625-636
    • /
    • 2011
  • This paper proposes two parametric entropy estimators for the lognormal distribution, the minimum variance unbiased estimator and the maximum likelihood estimator, and compares their properties. The variances of both estimators are derived. The influence of the bias of the maximum likelihood estimator on estimation is analytically revealed. The distributions of the proposed estimators obtained by the delta approximation method are also presented. Performance comparisons between the two estimators are made. The following observations emerge from the results: the MSE efficacy of the minimum variance unbiased estimator is consistently high and increases rapidly as the sample size and variance, n and ${\sigma}^2$, become simultaneously small. In conclusion, the minimum variance unbiased estimator outperforms the maximum likelihood estimator.
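A sketch of the two parametric estimators this abstract compares, using the standard facts that the lognormal entropy is ${\mu}+\frac{1}{2}\log(2{\pi}e{\sigma}^2)$ and that $(n-1)S^2/{\sigma}^2$ is chi-squared with $n-1$ degrees of freedom (the numerical digamma shortcut and all function names are ours, not the paper's):

```python
import math

def digamma(x, h=1e-5):
    # Stdlib-only numerical digamma via a central difference of lgamma.
    return (math.lgamma(x + h) - math.lgamma(x - h)) / (2 * h)

def lognormal_entropy_mle(x):
    """Plug-in (maximum likelihood) estimator of the lognormal entropy
    H = mu + 0.5 * log(2 * pi * e * sigma^2)."""
    logs = [math.log(v) for v in x]
    n = len(logs)
    mu = sum(logs) / n
    s2 = sum((v - mu) ** 2 for v in logs) / n  # MLE variance (divides by n)
    return mu + 0.5 * math.log(2 * math.pi * math.e * s2)

def lognormal_entropy_umvue(x):
    """Unbiased estimator: corrects the bias of log S^2, since
    E[log S^2] = log sigma^2 + psi((n-1)/2) - log((n-1)/2)."""
    logs = [math.log(v) for v in x]
    n = len(logs)
    mu = sum(logs) / n
    s2 = sum((v - mu) ** 2 for v in logs) / (n - 1)  # unbiased sample variance
    correction = math.log((n - 1) / 2) - digamma((n - 1) / 2)
    return mu + 0.5 * (math.log(s2) + correction) + 0.5 * math.log(2 * math.pi * math.e)
```

For large samples the two estimates nearly coincide; the paper's comparison concerns their behavior when $n$ and ${\sigma}^2$ are small.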

Minimum Variance Unbiased Estimation for the Maximum Entropy of the Transformed Inverse Gaussian Random Variable by $Y=X^{-1/2}$

  • Choi, Byung-Jin
    • Communications for Statistical Applications and Methods
    • /
    • v.13 no.3
    • /
    • pp.657-667
    • /
    • 2006
  • The concept of entropy, introduced in communication theory by Shannon (1948) as a measure of uncertainty, is of prime interest in information-theoretic statistics. This paper considers the minimum variance unbiased estimation of the maximum entropy of the inverse Gaussian random variable transformed by $Y=X^{-1/2}$. The properties of the derived UMVU estimator are investigated.

Goodness-of-fit test for normal distribution based on parametric and nonparametric entropy estimators (모수적 엔트로피 추정량과 비모수적 엔트로피 추정량에 기초한 정규분포에 대한 적합도 검정)

  • Choi, Byungjin
    • Journal of the Korean Data and Information Science Society
    • /
    • v.24 no.4
    • /
    • pp.847-856
    • /
    • 2013
  • In this paper, we deal with testing goodness-of-fit for the normal distribution based on parametric and nonparametric entropy estimators. The minimum variance unbiased estimator for the entropy of the normal distribution is derived as a parametric entropy estimator to be used in the construction of a test statistic. For a nonparametric entropy estimator of the data-generating distribution under the alternative hypothesis, sample entropy and its modifications are used. The critical values of the proposed tests are estimated by Monte Carlo simulations and presented in tabular form. The performance of the proposed tests under some selected alternatives is investigated by means of simulations. The results show that the proposed tests have better power than the previous entropy-based test by Vasicek (1976). In applications, the new tests are expected to serve as a competitive tool for testing normality.

Modified Mass-Preserving Sample Entropy

  • Kim, Chul-Eung;Park, Sang-Un
    • Communications for Statistical Applications and Methods
    • /
    • v.9 no.1
    • /
    • pp.13-19
    • /
    • 2002
  • In nonparametric entropy estimation, both the mass- and mean-preserving maximum entropy distribution (Theil, 1980) and the underlying distribution of the sample entropy (Vasicek, 1976), the most widely used entropy estimator, consist of n mass-preserving densities based on disjoint intervals defined by the simple averages of two adjacent order statistics. In this paper, we note that those nonparametric density functions do not actually satisfy the mass-preserving constraint, and propose a modified sample entropy by considering the generalized O-statistics (Kaigh and Driscoll, 1987) in averaging two adjacent order statistics. We consider the proposed estimator in a goodness-of-fit test for normality and compare its performance with that of the sample entropy.

Testing Uniformity Based on Vasicek's Estimator

  • Kim, Jong-Tae;Cha, Young-Joon
    • Journal of the Korean Data and Information Science Society
    • /
    • v.15 no.1
    • /
    • pp.119-127
    • /
    • 2004
  • To test the uniformity of a population, we modify the test statistic based on the sample entropy in the literature and establish its limiting distribution under weaker conditions, which improves the existing results. We also show that the proposed test statistic based on Vasicek's entropy estimator is consistent.


Improving Sample Entropy Based on Nonparametric Quantile Estimation

  • Park, Sang-Un;Park, Dong-Ryeon
    • Communications for Statistical Applications and Methods
    • /
    • v.18 no.4
    • /
    • pp.457-465
    • /
    • 2011
  • Sample entropy (Vasicek, 1976) has poor performance, and several nonparametric entropy estimators have been proposed as alternatives. In this paper, we consider a piecewise uniform density function based on quantiles, which enables us to evaluate entropy in each interval, and attribute the poor performance of the sample entropy to the poor estimation of the lower and upper quantiles. We then propose some improved entropy estimators by simply modifying the quantile estimators, and compare their performances with some existing estimators.

On Information Theoretic Index for Measuring the Stochastic Dependence Among Sets of Variates

  • Kim, Hea-Jung
    • Journal of the Korean Statistical Society
    • /
    • v.26 no.1
    • /
    • pp.131-146
    • /
    • 1997
  • In this paper, the problem of measuring the stochastic dependence among sets of random variates is considered, and attention is specifically directed to forming a single well-defined measure of the dependence among sets of normal variates. A new information-theoretic measure of the dependence, called the dependence index (DI), is introduced and several of its properties are studied. The development of DI is based on the generalization and normalization of the mutual information introduced by Kullback (1968). For data analysis, a minimum cross-entropy estimator of DI is suggested, and its asymptotic distribution is obtained for testing the existence of the dependence. Monte Carlo simulations demonstrate the performance of the estimator, and show that it is useful not only for evaluating the dependence, but also for testing model independence.


Hierarchical and Empirical Bayes Estimators of Gamma Parameter under Entropy Loss

  • Chung, Youn-Shik
    • Communications for Statistical Applications and Methods
    • /
    • v.6 no.1
    • /
    • pp.221-235
    • /
    • 1999
  • Let $X_1,\ldots,X_p$, $p\geq2$, be independent random variables, where each $X_i$ has a gamma distribution with parameters $k_i$ and $\theta_i$. The problem is to simultaneously estimate the $p$ gamma parameters $\theta_i$ and $\theta_i^{-1}$ under entropy loss, where the parameters are believed a priori to be related. Hierarchical Bayes (HB) and empirical Bayes (EB) estimators are investigated, and a preference for the HB estimator over the EB estimator is shown using the Gibbs sampler (Gelfand and Smith, 1990). Finally, a computer simulation is performed to compute the risk percentage improvements of the HB estimator and the estimator of Dey, Ghosh and Srinivasan (1987) over the UMVUE of $\theta^{-1}$.
