Minimum Variance Unbiased Estimation for the Maximum Entropy of the Transformed Inverse Gaussian Random Variable by $Y=X^{-1/2}$

  • Choi, Byung-Jin (Department of Applied Information Statistics, Kyonggi University)
  • Published : 2006.12.31

Abstract

The concept of entropy, introduced in communication theory by Shannon (1948) as a measure of uncertainty, is of prime interest in information-theoretic statistics. This paper considers minimum variance unbiased estimation of the maximum entropy of the inverse Gaussian random variable transformed by $Y=X^{-1/2}$. The properties of the derived UMVU estimator are investigated.
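
For orientation, the following is a minimal sketch of the quantities involved, assuming the standard $IG(\mu,\lambda)$ parameterization of Chhikara and Folks (1989) and the maximum-entropy characterization in Mudholkar and Tian (2002); the exact expressions used in the paper may differ. If $X{\sim}IG(\mu,\lambda)$, then $Y=X^{-1/2}$ has density $g(y)=\sqrt{2\lambda/\pi}\,\exp\{-\lambda(1/y-{\mu}y)^2/(2\mu^2)\}$, $y>0$, and its entropy

$$H(Y)=\frac{1}{2}\log\frac{{\pi}e}{2\lambda}$$

is the largest attainable when $E(X)$ and $E(1/X)$ are held fixed. Since ${\lambda}V{\sim}\chi^2_{n-1}$ for $V=\sum_{i=1}^{n}(1/X_i-1/\bar{X})$ and $(\bar{X},V)$ is a complete sufficient statistic, an unbiased, and hence UMVU, estimator of $H(Y)$ is

$$\hat{H}=\frac{1}{2}\left\{1+\log\frac{\pi}{4}+\log V-\psi\!\left(\frac{n-1}{2}\right)\right\},$$

where $\psi$ denotes the digamma function. The estimator depends on the data only through $V$, reflecting that $H(Y)$ is a function of $\lambda$ alone.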

Keywords

References

  1. Abramowitz, M. and Stegun, I.A. (1970). Handbook of Mathematical Functions with Formulas, Graphs, and Mathematical Tables. Dover Publications, Inc, New York
  2. Ahmed, N.A. and Gokhale, D.V. (1989). Entropy expressions and their estimators for multivariate distributions. IEEE Transactions on Information Theory, Vol. 35, 688-692 https://doi.org/10.1109/18.30996
  3. Burbea, J. and Rao, C.R. (1982). Entropy differential metric, distance and divergence measures in probability spaces: A unified approach. Journal of Multivariate Analysis, Vol. 12, 576-579
  4. Chhikara, R.S. and Folks, J.L. (1989). The Inverse Gaussian Distribution: Theory, Methodology, and Applications. Marcel Dekker, Inc, New York
  5. Gradshteyn, I.S. and Ryzhik, I.M. (2000). Table of Integrals, Series, and Products (6th Edition). Academic Press, San Diego
  6. Havrda, J. and Charvat, F. (1967). Quantification method in classification processes: concept of structural α-entropy. Kybernetika, Vol. 3, 30-35
  7. Hogg, R.V., McKean, J.W. and Craig, A.T. (2005). Introduction to Mathematical Statistics. Pearson Education, Inc., Upper Saddle River
  8. Jaynes, E.T. (1957). Information theory and statistical mechanics. Physical Review, Vol. 106, 620-630 https://doi.org/10.1103/PhysRev.106.620
  9. Kapur, J.N. and Kesavan, H.K. (1992). Entropy Optimization Principles with Applications. Academic Press, San Diego
  10. Michael, J.R., Schucany, W.R. and Haas, R.W. (1976). Generating random variates using transformations with multiple roots. The American Statistician, Vol. 30, 88-90 https://doi.org/10.2307/2683801
  11. Mudholkar, G.S. and Tian L. (2002). An entropy characterization of the inverse Gaussian distribution and related goodness-of-fit test. Journal of Statistical Planning and Inference, Vol. 102, 211-221 https://doi.org/10.1016/S0378-3758(01)00099-4
  12. Olshen, A.C. (1937). Transformations of the Pearson type III distributions. The Annals of Mathematical Statistics, Vol. 8, 176-200
  13. Shannon, C.E. (1948). A mathematical theory of communication. Bell System Technical Journal, Vol. 27, 379-423, 623-656 https://doi.org/10.1002/j.1538-7305.1948.tb01338.x
  14. Singh, V.P. (1998). Entropy-Based Parameter Estimation in Hydrology. Kluwer Academic Publishers, Dordrecht, The Netherlands

Cited by

  1. A Modified Entropy-Based Goodness-of-Fit Test for Inverse Gaussian Distribution vol.24, pp.2, 2011, https://doi.org/10.5351/KJAS.2011.24.2.383