• Title/Summary/Keyword: inverse Gaussian


Shrinkage Estimator of Dispersion of an Inverse Gaussian Distribution

  • Lee, In-Suk; Park, Young-Soo
    • Journal of the Korean Data and Information Science Society, v.17 no.3, pp.805-809, 2006
  • In this paper, a shrinkage estimator for the measure of dispersion of the inverse Gaussian distribution with known mean is proposed. We also compare the relative bias and relative efficiency of the proposed estimator with respect to the minimum variance unbiased estimator.

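The abstract does not give the exact form of the shrinkage weight, but the general construction can be sketched. With the mean mu known, the sample mean of (X − mu)²/(mu²X) is unbiased for the dispersion 1/lambda, and a shrinkage estimator pulls it toward a prior guess. The fixed weight `k` below is a hypothetical illustration, not the paper's choice:

```python
import numpy as np

def shrinkage_dispersion(x, mu, d0, k=0.5):
    """Shrinkage estimate of the IG dispersion 1/lambda with known mean mu.

    Under IG(mu, lambda), lambda * (X - mu)^2 / (mu^2 * X) is chi-square(1),
    so the sample mean of (X - mu)^2 / (mu^2 * X) is unbiased for 1/lambda.
    d0 is a prior guess of 1/lambda; k is a (hypothetical) fixed weight.
    """
    x = np.asarray(x, dtype=float)
    d_hat = np.mean((x - mu) ** 2 / (mu ** 2 * x))  # unbiased estimator of 1/lambda
    return k * d_hat + (1.0 - k) * d0               # linear shrinkage toward d0
```

For k in (0, 1) the estimate always lies between the unbiased estimate and the prior guess, trading a little bias for reduced variance when the guess is good.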

Bayesian Model Selection for Inverse Gaussian Populations with Heterogeneity

  • Kang, Sang-Gil; Kim, Dal-Ho; Lee, Woo-Dong
    • Journal of the Korean Data and Information Science Society, v.19 no.2, pp.621-634, 2008
  • This paper addresses the problem of testing whether the means of several inverse Gaussian populations with heterogeneity are equal. The analysis of reciprocals for the equality of inverse Gaussian means requires the assumption of equal scale parameters. We propose Bayesian model selection procedures for testing equality of the inverse Gaussian means under noninformative priors without the assumption of equal scale parameters. A noninformative prior is usually improper, which yields a calibration problem: the Bayes factor is defined only up to a multiplicative constant. We therefore propose objective Bayesian model selection procedures based on the fractional Bayes factor and the intrinsic Bayes factor under the reference prior. A simulation study and a real data analysis are provided.


Kullback-Leibler Information-Based Tests of Fit for Inverse Gaussian Distribution (역가우스분포에 대한 쿨백-라이블러 정보 기반 적합도 검정)

  • Choi, Byung-Jin
    • The Korean Journal of Applied Statistics, v.24 no.6, pp.1271-1284, 2011
  • The entropy-based test of fit for the inverse Gaussian distribution presented by Mudholkar and Tian (2002) can only be applied to the composite hypothesis that a sample is drawn from an inverse Gaussian distribution with both the location and scale parameters unknown. In applications, however, a researcher may want a test of fit for an inverse Gaussian distribution with one parameter known, or with both parameters known. In this paper, we introduce tests of fit for the inverse Gaussian distribution based on the Kullback-Leibler information as an extension of the entropy-based test. A window size must be chosen to implement the proposed tests. By means of Monte Carlo simulations, window sizes are determined for a wide range of sample sizes, and the corresponding critical values of the test statistics are estimated. The results of a power analysis for various alternatives show that the Kullback-Leibler information-based goodness-of-fit tests have good power.
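The core ingredients of such a test can be sketched: Vasicek's spacing-based entropy estimator (with window size m) for the data-generating density, minus the mean fitted IG log-density, estimates the Kullback-Leibler divergence. The standardization and critical values come from the paper's simulations; the sketch below, assuming scipy's IG parametrization, only illustrates the statistic itself:

```python
import numpy as np
from scipy.stats import invgauss

def vasicek_entropy(x, m):
    """Vasicek's spacing-based entropy estimator with window size m."""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    i = np.arange(n)
    lo = x[np.clip(i - m, 0, n - 1)]   # order statistics clamped at the extremes
    hi = x[np.clip(i + m, 0, n - 1)]
    return np.mean(np.log(n / (2.0 * m) * (hi - lo)))

def kl_fit_statistic(x, m):
    """Estimate KL(g || IG) as -H_vasicek(x) minus the mean fitted IG log-density."""
    x = np.asarray(x, dtype=float)
    mu_hat = x.mean()                                   # closed-form IG MLE of the mean
    lam_hat = len(x) / np.sum(1.0 / x - 1.0 / mu_hat)   # closed-form IG MLE of the shape
    # scipy parametrizes IG(mu, lam) as invgauss(mu/lam, scale=lam)
    logf = invgauss.logpdf(x, mu_hat / lam_hat, scale=lam_hat)
    return -vasicek_entropy(x, m) - np.mean(logf)
```

Large values of the statistic indicate departure from the inverse Gaussian family; the rejection threshold for a given (n, m) would be taken from simulated critical values.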

A Test of Fit for Inverse Gaussian Distribution Based on the Probability Integration Transformation (확률적분변환에 기초한 역가우스분포에 대한 적합도 검정)

  • Choi, Byungjin
    • The Korean Journal of Applied Statistics, v.26 no.4, pp.611-622, 2013
  • Mudholkar and Tian (2002) proposed an entropy-based test of fit for the inverse Gaussian distribution; however, the test applies only to the composite hypothesis of an inverse Gaussian distribution with an unknown location parameter. In this paper, we propose an entropy-based goodness-of-fit test for the inverse Gaussian distribution that can be applied to the composite hypothesis as well as to the simple hypothesis of an inverse Gaussian distribution with a specified location parameter. The proposed test is based on the probability integration transformation. Critical values of the test statistic, estimated by simulation, are presented in a table. A simulation study compares the power of the proposed test under selected alternatives with that of the test of Mudholkar and Tian (2002). The results show that the proposed test is more powerful than the previous entropy-based test.
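The transformation step is standard: if X follows the hypothesized IG(mu, lambda), then U = F(X) is Uniform(0, 1), so goodness of fit reduces to testing the U's for uniformity. The paper's statistic is entropy-based; the sketch below uses an off-the-shelf Kolmogorov-Smirnov check purely as a stand-in to illustrate the transformation, assuming scipy's IG parametrization:

```python
import numpy as np
from scipy.stats import invgauss, kstest

def pit_test(x, mu, lam):
    """Probability integral transformation for a fully specified IG(mu, lam):
    U = F(X; mu, lam) ~ Uniform(0, 1) under the null, so test the U's for
    uniformity. scipy parametrizes IG(mu, lam) as invgauss(mu/lam, scale=lam)."""
    u = invgauss.cdf(np.asarray(x, dtype=float), mu / lam, scale=lam)
    return kstest(u, "uniform")   # KS uniformity check as a stand-in statistic
```

A small p-value rejects the hypothesis that the sample came from IG(mu, lam); with estimated parameters the null distribution of any such statistic changes, which is why the paper tabulates its own critical values.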

A Modified Entropy-Based Goodness-of-Fit Test for Inverse Gaussian Distribution (역가우스분포에 대한 변형된 엔트로피 기반 적합도 검정)

  • Choi, Byung-Jin
    • The Korean Journal of Applied Statistics, v.24 no.2, pp.383-391, 2011
  • This paper presents a modified entropy-based test of fit for the inverse Gaussian distribution. The test is based on the difference between the entropy of the unknown data-generating distribution and that of the inverse Gaussian distribution. The entropy difference estimator used as the test statistic is obtained by employing Vasicek's sample entropy as the entropy estimator for the data-generating distribution and the uniformly minimum variance unbiased estimator as the entropy estimator for the inverse Gaussian distribution. Empirically determined critical values of the test statistic are provided in a table. Monte Carlo simulations are performed to compare the power of the proposed test with that of the previous entropy-based test.

Noninformative Priors for the Ratio of Parameters in Inverse Gaussian Distribution (INVERSE GAUSSIAN분포의 모수비에 대한 무정보적 사전분포에 대한 연구)

  • Kang, Sang-Gil; Kim, Dal-Ho; Lee, Woo-Dong
    • The Korean Journal of Applied Statistics, v.17 no.1, pp.49-60, 2004
  • In this paper, when the observations are distributed as inverse Gaussian, we develop noninformative priors for the ratio of the parameters of the inverse Gaussian distribution. We derive the first-order matching prior and prove that a second-order matching prior does not exist. It turns out that the one-at-a-time reference prior satisfies the first-order matching criterion. A simulation study is performed.

Bayesian Testing for the Equality of Two Inverse Gaussian Populations with the Fractional Bayes Factor

  • Ko, Jeong-Hwan
    • Journal of the Korean Data and Information Science Society, v.16 no.3, pp.539-547, 2005
  • We propose a Bayesian test for the equality of two independent inverse Gaussian population means using the fractional Bayes factor suggested by O'Hagan (1995). As prior distributions for the parameters, we assume noninformative priors. To investigate the usefulness of the proposed Bayesian testing procedure, its behavior is examined via a real data analysis.

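For reference, O'Hagan's fractional Bayes factor resolves the arbitrary multiplicative constant in an improper prior by spending a fraction $b$ of the likelihood on training the prior. A standard statement (generic notation, not taken from the paper) is

```latex
B^{F}_{01}(b) = \frac{q_0(b, x)}{q_1(b, x)},
\qquad
q_i(b, x) = \frac{\int \pi_i(\theta_i)\, L_i(\theta_i \mid x)\, d\theta_i}
                 {\int \pi_i(\theta_i)\, L_i(\theta_i \mid x)^{\,b}\, d\theta_i},
```

where $L_i$ is the likelihood under model $M_i$ and $\pi_i$ its (possibly improper) prior; the unspecified constant in $\pi_i$ cancels between the numerator and denominator of $q_i$. A common choice is $b = m_0/n$, the minimal training sample size over the full sample size.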

ON TESTING THE EQUALITY OF THE COEFFICIENTS OF VARIATION IN TWO INVERSE GAUSSIAN POPULATIONS

  • Choi, Byung-Jin; Kim, Kee-Young
    • Journal of the Korean Statistical Society, v.32 no.2, pp.93-101, 2003
  • This paper deals with testing the equality of the coefficients of variation in two inverse Gaussian populations. The likelihood ratio, Lagrange multiplier and Wald tests are presented. Monte Carlo simulations are performed to compare the powers of these tests. In the simulation study, the likelihood ratio test appears consistently more powerful than the Lagrange multiplier and Wald tests when the sample size is small. The powers of all the tests tend to be similar as the sample size increases.
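As background for the three tests, the coefficient of variation of $IG(\mu, \lambda)$ has a simple closed form, and the equality hypothesis imposes a single constraint (standard results, not specific to this paper):

```latex
\mathrm{CV} = \frac{\sqrt{\operatorname{Var}(X)}}{\operatorname{E}(X)}
            = \frac{\sqrt{\mu^{3}/\lambda}}{\mu}
            = \sqrt{\mu/\lambda},
\qquad
\Lambda = 2\,\bigl\{\ell(\hat{\theta}) - \ell(\hat{\theta}_0)\bigr\}
\;\xrightarrow{d}\; \chi^{2}_{1},
```

where $\hat{\theta}$ is the unrestricted MLE and $\hat{\theta}_0$ the MLE under $H_0 : \sqrt{\mu_1/\lambda_1} = \sqrt{\mu_2/\lambda_2}$; one constraint gives one degree of freedom, which likewise applies asymptotically to the Lagrange multiplier and Wald statistics.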

Minimum Variance Unbiased Estimation for the Maximum Entropy of the Transformed Inverse Gaussian Random Variable by Y=X^{-1/2}

  • Choi, Byung-Jin
    • Communications for Statistical Applications and Methods, v.13 no.3, pp.657-667, 2006
  • The concept of entropy, introduced into communication theory by Shannon (1948) as a measure of uncertainty, is of prime interest in information-theoretic statistics. This paper considers minimum variance unbiased estimation of the maximum entropy of the inverse Gaussian random variable transformed by $Y=X^{-1/2}$. The properties of the derived UMVU estimator are investigated.