• Title/Summary/Keyword: minimum distance estimator

Search results: 18

Reducing Bias of the Minimum Hellinger Distance Estimator of a Location Parameter

  • Pak, Ro-Jin
    • Journal of the Korean Data and Information Science Society / v.17 no.1 / pp.213-220 / 2006
  • Since Beran (1977) developed minimum Hellinger distance estimation, this method has been a popular topic in the field of robust estimation. In defining the distance, a kernel density estimator has been widely used as the density estimator. In this article, however, we show that a combination of a kernel density estimator and an empirical density can result in a smaller bias of the minimum Hellinger distance estimator of a location parameter than a kernel density estimator alone.

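
As a concrete illustration of the basic recipe behind Beran's estimator (minimize the Hellinger distance between a nonparametric density estimate and the model density), here is a minimal numerical sketch for a normal location parameter. The sample, grid, and KDE bandwidth are assumptions for demonstration; the paper's bias-reducing empirical-density variant is not implemented here.

```python
# Illustrative minimum Hellinger distance estimation (MHDE) of a normal
# location parameter.  Sample, grid, and bandwidth are assumed values.
import numpy as np
from scipy import stats, optimize, integrate

rng = np.random.default_rng(0)
x = rng.normal(loc=2.0, scale=1.0, size=300)

g_hat = stats.gaussian_kde(x)                  # nonparametric density estimate
grid = np.linspace(-3.0, 7.0, 2001)
g_vals = g_hat(grid)

def hellinger_sq(mu):
    """Squared Hellinger distance between the KDE and N(mu, 1) on the grid."""
    f_vals = stats.norm.pdf(grid, loc=mu, scale=1.0)
    return integrate.trapezoid((np.sqrt(g_vals) - np.sqrt(f_vals)) ** 2, grid)

mhde = optimize.minimize_scalar(hellinger_sq, bounds=(-1.0, 5.0),
                                method="bounded").x
```

With clean data the MHDE tracks the sample mean closely; its advantage appears under contamination, where the square-root transform downweights outlying mass.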

The Minimum Squared Distance Estimator and the Minimum Density Power Divergence Estimator

  • Pak, Ro-Jin
    • Communications for Statistical Applications and Methods / v.16 no.6 / pp.989-995 / 2009
  • Basu et al. (1998) proposed the minimum divergence estimating method, which is free from the painful kernel density estimator. Their proposed class of density power divergences is indexed by a single parameter $\alpha$ that controls the trade-off between robustness and efficiency. In this article, (1) we introduce a new large class, the minimum squared distance, which ranges from the minimum Hellinger distance to the minimum $L_2$ distance, and show that under certain conditions the minimum density power divergence estimator (MDPDE) and the minimum squared distance estimator (MSDE) are asymptotically equivalent; and (2) in finite samples the MDPDE generally performs better than the MSDE, but there are cases where the MSDE performs better when estimating a location parameter or a proportion of mixed distributions.
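
For readers unfamiliar with the density power divergence, a minimal sketch of the MDPDE for a normal location with known scale follows. The objective is the standard empirical form of the divergence; the choice $\alpha = 0.5$, the sample, and the contamination are illustrative assumptions.

```python
# Sketch of the minimum density power divergence estimator (MDPDE) of
# Basu et al. (1998) for N(mu, 1); alpha trades robustness for efficiency.
# Sample size, alpha, and the outlier cluster are assumed values.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(0.0, 1.0, 180),
                    np.full(20, 10.0)])        # 10% gross outliers

alpha = 0.5

def dpd_objective(mu):
    # For f = N(mu, 1), the integral of f**(1 + alpha) has the closed form
    # (2*pi)**(-alpha/2) / sqrt(1 + alpha), which is free of mu.
    int_f = (2.0 * np.pi) ** (-alpha / 2.0) / np.sqrt(1.0 + alpha)
    f_x = stats.norm.pdf(x, loc=mu, scale=1.0)
    return int_f - (1.0 + 1.0 / alpha) * np.mean(f_x ** alpha)

mus = np.linspace(-2.0, 8.0, 2001)
mdpde = mus[np.argmin([dpd_objective(m) for m in mus])]
mle = x.mean()      # the non-robust MLE for comparison
```

As $\alpha \to 0$ the objective approaches the negative log-likelihood, recovering the MLE; larger $\alpha$ buys robustness at some efficiency cost.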

Minimum Disparity Estimation for Normal Models: Small Sample Efficiency

  • Cho M. J.;Hong C. S.;Jeong D. B.
    • Communications for Statistical Applications and Methods / v.12 no.1 / pp.149-167 / 2005
  • The minimum disparity estimators introduced by Lindsay and Basu (1994) are studied empirically. An extensive simulation in this paper provides small-sample location estimates and supplies empirical evidence of estimator performance for the univariate contaminated normal model. Empirical results show that the minimum generalized negative exponential disparity estimator (MGNEDE) attains high efficiency for small sample sizes and dominates the maximum likelihood estimator (MLE) and the minimum blended weight Hellinger distance estimator (MBWHDE) with respect to efficiency at the contaminated model.

M-Estimation Functions Induced From Minimum L$_2$ Distance Estimation

  • Pak, Ro-Jin
    • Journal of the Korean Statistical Society / v.27 no.4 / pp.507-514 / 1998
  • Minimum distance estimation based on the $L_2$ distance between a model density and a density estimator is studied from the M-estimation point of view. We show how a model density and a density estimator are incorporated in order to create an M-estimating function. This method enables us to create an M-estimating function reflecting the natures of both an assumed model density and a given set of data. Some new types of M-estimating functions for estimating location and scale parameters are introduced.

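
The connection described above can be made concrete: for a normal location with known scale, minimizing the $L_2$ criterion (with the empirical average standing in for the kernel term) induces a redescending M-estimating function. A minimal sketch, with all sample values assumed for illustration:

```python
# Minimum L2-distance estimation of a normal location (scale known).
# The L2 objective expands to: integral(f_mu^2) - 2 * integral(f_mu * g);
# here the data term uses the empirical average (2/n) * sum f_mu(x_i).
# Sample values and the contamination are assumed for illustration.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
x = np.concatenate([rng.normal(5.0, 1.0, 190), np.full(10, 25.0)])

def l2_objective(mu):
    # integral of f_mu^2 dx = 1 / (2*sqrt(pi)) for N(mu, 1), free of mu.
    return 1.0 / (2.0 * np.sqrt(np.pi)) - 2.0 * np.mean(stats.norm.pdf(x, mu, 1.0))

mus = np.linspace(0.0, 30.0, 3001)
msde_l2 = mus[np.argmin([l2_objective(m) for m in mus])]
# Setting the derivative to zero gives sum psi(x_i - mu) = 0 with
# psi(r) = r * exp(-r**2 / 2): a redescending M-estimating function, so
# gross outliers contribute essentially nothing to the estimating equation.
```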

Minimum Hellinger Distance Estimation and Minimum Density Power Divergence Estimation in Estimating Mixture Proportions

  • Pak, Ro-Jin
    • Journal of the Korean Data and Information Science Society / v.16 no.4 / pp.1159-1165 / 2005
  • Basu et al. (1998) proposed a new density-based estimator, called the minimum density power divergence estimator (MDPDE), which avoids nonparametric density estimation and associated complications such as bandwidth selection. Woodward et al. (1995) examined the minimum Hellinger distance estimator (MHDE), proposed by Beran (1977), for estimating the mixture proportion in a mixture of two normals. In this article, we introduce the MDPDE for a mixture proportion and show that the MDPDE and the MHDE have the same asymptotic distribution at the model. A simulation study identifies some cases where the MHDE is consistently better than the MDPDE in terms of bias.


CONSISTENT AND ASYMPTOTICALLY NORMAL ESTIMATORS FOR PERIODIC BILINEAR MODELS

  • Bibi, Abdelouahab;Gautier, Antony
    • Bulletin of the Korean Mathematical Society / v.47 no.5 / pp.889-905 / 2010
  • In this paper, a distribution-free approach to the parameter estimation of a simple bilinear model with periodic coefficients is presented. The proposed method relies on a minimum distance estimator based on the autocovariances of the squared process. Consistency and asymptotic normality of the estimator, as well as hypothesis testing, are derived. Numerical experiments on simulated data sets are presented to highlight the theoretical results.

Robustness of Minimum Disparity Estimators in Linear Regression Models

  • Pak, Ro-Jin
    • Journal of the Korean Statistical Society / v.24 no.2 / pp.349-360 / 1995
  • This paper deals with the robustness properties of minimum disparity estimation in linear regression models. The estimators are defined as the statistical quantities which minimize the blended weight Hellinger distance between a weighted kernel density estimator of the residuals and a smoothed model density of the residuals. It is shown that if the weights of the density estimator are appropriately chosen, the estimates of the regression parameters are robust.


Minimum Hellinger Distance Based Goodness-of-fit Tests in Normal Models: Empirical Approach

  • Dong Bin Jeong
    • Communications for Statistical Applications and Methods / v.6 no.3 / pp.967-976 / 1999
  • In this paper we study Hellinger distance based goodness-of-fit tests that are analogs of likelihood ratio tests. The minimum Hellinger distance estimator (MHDE) in normal models provides an excellent robust alternative to the usual maximum likelihood estimator. Our simulation results show that the goodness-of-fit test based on the Hellinger deviance test (Simpson 1989) is robust when data contain outliers. The Hellinger deviance test (Simpson 1989) is a more direct method for obtaining robust inferences than an automated outlier screen applied before likelihood ratio test data analysis.


A Robust Estimation for the Composite Lognormal-Pareto Model

  • Pak, Ro Jin
    • Communications for Statistical Applications and Methods / v.20 no.4 / pp.311-319 / 2013
  • Cooray and Ananda (2005) proposed a composite lognormal-Pareto model to analyze loss payment data in the actuarial and insurance industries. Their model combines a lognormal density up to an unknown threshold value with a two-parameter Pareto density beyond it. In this paper, we implement minimum density power divergence estimation for the composite lognormal-Pareto density and compare the performances of the minimum density power divergence estimator (MDPDE) and the maximum likelihood estimator (MLE) by simulations and an example. The MDPDE performs reasonably well against various violations of the distribution: it fits small observations better and resists extraordinarily large observations better than the MLE.
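
To fix ideas, a composite lognormal-Pareto density can be sketched as a truncated lognormal body spliced to a Pareto tail at a threshold, with the mixing weight chosen for continuity. This is a simplified, continuity-only illustration with assumed parameter values, not the exact smoothness-constrained parameterization of Cooray and Ananda (2005):

```python
# Simplified composite lognormal-Pareto density: lognormal body on (0, theta],
# Pareto tail on (theta, inf), spliced continuously at theta.  All parameter
# values are hypothetical; Cooray and Ananda (2005) additionally impose
# differentiability at theta, which reduces the number of free parameters.
import numpy as np
from scipy import stats, integrate

theta, mu, sigma, tail_alpha = 10.0, 2.0, 0.6, 2.5     # assumed values

ln = stats.lognorm(s=sigma, scale=np.exp(mu))

def body(x):
    """Lognormal density renormalized to the interval (0, theta]."""
    return ln.pdf(x) / ln.cdf(theta)

def tail(x):
    """Pareto density on (theta, inf)."""
    return tail_alpha * theta ** tail_alpha / x ** (tail_alpha + 1.0)

# Continuity at theta: w * body(theta) = (1 - w) * tail(theta).
w = tail(theta) / (body(theta) + tail(theta))

def composite_pdf(x):
    x = np.asarray(x, dtype=float)
    return np.where(x <= theta, w * body(x), (1.0 - w) * tail(x))

# Sanity check: the spliced density should integrate to (almost exactly) 1.
xs = np.linspace(1e-6, 2000.0, 400001)
total_mass = integrate.trapezoid(composite_pdf(xs), xs)
```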

Negative Exponential Disparity Based Deviance and Goodness-of-fit Tests for Continuous Models: Distributions, Efficiency and Robustness

  • Jeong, Dong-Bin;Sahadeb Sarkar
    • Journal of the Korean Statistical Society / v.30 no.1 / pp.41-61 / 2001
  • The minimum negative exponential disparity estimator (MNEDE), introduced by Lindsay (1994), is an excellent competitor to the minimum Hellinger distance estimator (Beran 1977) as a robust and yet efficient alternative to the maximum likelihood estimator in parametric models. In this paper we define the negative exponential deviance test (NEDT) as an analog of the likelihood ratio test (LRT), and show that the NEDT is asymptotically equivalent to the LRT at the model and under a sequence of contiguous alternatives. We establish that the asymptotic strong breakdown point for a class of minimum disparity estimators, containing the MNEDE, is at least 1/2 in continuous models. This result leads us to anticipate robustness of the NEDT under data contamination, which we demonstrate empirically. In fact, in the simulation settings considered here the empirical level of the NEDT shows more stability than that of the Hellinger deviance test (Simpson 1989). The NEDT is illustrated through an example data set. We also define a goodness-of-fit statistic to assess the adequacy of a specified parametric model, and establish its asymptotic normality under the null hypothesis.
