Title/Summary/Keyword: density estimator

A note on nonparametric density deconvolution by weighted kernel estimators

  • Lee, Sungho
    • Journal of the Korean Data and Information Science Society / v.25 no.4 / pp.951-959 / 2014
  • Recently Hazelton and Turlach (2009) proposed a weighted kernel density estimator for the deconvolution problem. In the case of Gaussian kernels and measurement error, they argued that the weighted kernel density estimator is competitive with the classical deconvolution kernel estimator. In this paper we consider weighted kernel density estimators when the sample observations are contaminated by double exponentially distributed errors. The performance of the weighted kernel density estimators is compared with that of the classical deconvolution kernel estimator and of a kernel density estimator based on the support vector regression method by means of a simulation study. The weighted density estimator with the Gaussian kernel shows numerical instability in the practical implementation of the optimization. However, the weighted density estimates with the double exponential kernel show patterns very similar to the classical kernel density estimates in the simulations, although their shape is less satisfactory than that of the classical kernel density estimator with the Gaussian kernel.
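For Laplace (double exponential) measurement error, the classical deconvolution kernel estimator used as the benchmark above admits a closed-form deconvoluting kernel when a Gaussian kernel is employed. A minimal sketch (function name and parameter choices are illustrative, not from the paper):

```python
import numpy as np

def deconv_kde_laplace(x_grid, w, h, sigma):
    """Classical deconvolution kernel density estimate on x_grid.

    w     : contaminated observations W_i = X_i + e_i, where e_i follows
            a Laplace (double exponential) distribution of scale sigma,
            with characteristic function 1 / (1 + sigma^2 t^2)
    h     : bandwidth
    With a Gaussian kernel, Fourier inversion yields the closed-form
    deconvoluting kernel  L(u) = phi(u) * (1 + (sigma/h)^2 * (1 - u^2)),
    where phi is the standard normal density.
    """
    u = (x_grid[:, None] - w[None, :]) / h            # scaled residuals, (grid, n)
    phi = np.exp(-0.5 * u ** 2) / np.sqrt(2 * np.pi)  # Gaussian kernel
    L = phi * (1.0 + (sigma / h) ** 2 * (1.0 - u ** 2))
    return L.mean(axis=1) / h                         # average over observations
```

Unlike an ordinary kernel estimate, the deconvolution estimate can dip below zero locally; the deconvoluting kernel L still integrates to one.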

Reducing Bias of the Minimum Hellinger Distance Estimator of a Location Parameter

  • Pak, Ro-Jin
    • Journal of the Korean Data and Information Science Society / v.17 no.1 / pp.213-220 / 2006
  • Since Beran (1977) developed minimum Hellinger distance estimation, this method has been a popular topic in the field of robust estimation. In the process of defining a distance, a kernel density estimator has been widely used as the density estimator. In this article, however, we show that a combination of a kernel density estimator and an empirical density can result in a smaller bias of the minimum Hellinger distance estimator of a location parameter than using a kernel density estimator alone.
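The minimum Hellinger distance idea can be sketched as follows, using a kernel density estimate alone and a N(theta, 1) location model; the model choice, grid, and bandwidth rule are illustrative assumptions, not the paper's combined kernel/empirical construction:

```python
import numpy as np

def mhde_location(x, h=None):
    """Minimum Hellinger distance estimate of theta in a N(theta, 1) model.

    Minimizing the Hellinger distance between the kernel density estimate
    f_hat and the model density g_theta is equivalent to maximizing the
    affinity  integral sqrt(f_hat * g_theta);  here the affinity is
    maximized over a grid of candidate theta values.
    """
    n = len(x)
    if h is None:                                   # rule-of-thumb bandwidth
        h = 1.06 * np.std(x) * n ** (-0.2)
    grid = np.linspace(x.min() - 3.0, x.max() + 3.0, 400)
    dg = grid[1] - grid[0]
    # Gaussian kernel density estimate evaluated on the grid
    f_hat = np.exp(-0.5 * ((grid[:, None] - x[None, :]) / h) ** 2).sum(axis=1)
    f_hat /= n * h * np.sqrt(2.0 * np.pi)

    thetas = np.linspace(x.min(), x.max(), 400)
    affinity = [
        (np.sqrt(f_hat * np.exp(-0.5 * (grid - t) ** 2)
                 / np.sqrt(2.0 * np.pi))).sum() * dg
        for t in thetas
    ]
    return thetas[np.argmax(affinity)]
```

Because the affinity down-weights regions where the model and the data disagree, the resulting location estimate is robust to outlying observations.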


Piecewise Continuous Linear Density Estimator

  • Jang, Dae-Heung
    • Journal of the Korean Data and Information Science Society / v.16 no.4 / pp.959-968 / 2005
  • The piecewise linear histogram can be used as a simple and efficient tool for density estimation. However, the piecewise linear histogram is a discontinuous function. We propose the piecewise continuous linear histogram as a simple and efficient density estimator and as an alternative to the piecewise linear histogram.
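The idea of joining histogram heights continuously can be sketched as a frequency polygon; this is a generic illustration of a piecewise continuous linear density estimator, and the paper's construction may differ in detail:

```python
import numpy as np

def frequency_polygon(x, bins=20):
    """Piecewise *continuous* linear density estimate: build an ordinary
    histogram, then join the bin-midpoint heights by straight lines
    (a frequency polygon).
    """
    heights, edges = np.histogram(x, bins=bins, density=True)
    mids = 0.5 * (edges[:-1] + edges[1:])
    # pad with zero-height points one bin-width beyond each end so the
    # polygon comes back down to zero continuously
    w = edges[1] - edges[0]
    xs = np.concatenate(([mids[0] - w], mids, [mids[-1] + w]))
    ys = np.concatenate(([0.0], heights, [0.0]))
    return lambda t: np.interp(t, xs, ys)
```

The returned function is continuous everywhere and, by construction, integrates to exactly the histogram's total area of one.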


A note on SVM estimators in RKHS for the deconvolution problem

  • Lee, Sungho
    • Communications for Statistical Applications and Methods / v.23 no.1 / pp.71-83 / 2016
  • In this paper we discuss a deconvolution density estimator obtained using support vector machines (SVM) and Tikhonov's regularization method for solving ill-posed problems in a reproducing kernel Hilbert space (RKHS). A remarkable property of the SVM is that it leads to sparse solutions, but the support vector deconvolution density estimator does not preserve sparsity as well as expected. Thus, in Section 3, we propose another support vector deconvolution estimator (method II) which leads to a very sparse solution. The performance of the deconvolution density estimators based on the support vector method is compared with that of the classical kernel deconvolution density estimator for the important cases of Gaussian and Laplacian measurement error by means of a simulation study. In the case of Gaussian error, the proposed support vector deconvolution estimator shows the same performance as the classical kernel deconvolution density estimator.

A Support Vector Method for the Deconvolution Problem

  • Lee, Sung-Ho
    • Communications for Statistical Applications and Methods / v.17 no.3 / pp.451-457 / 2010
  • This paper considers the problem of nonparametric deconvolution density estimation when sample observations are contaminated by double exponentially distributed errors. Three different deconvolution density estimators are introduced: a weighted kernel density estimator, a kernel density estimator based on the support vector regression method in an RKHS, and a classical kernel density estimator. The performance of these deconvolution density estimators is compared by means of a simulation study.

A Robust Estimation for the Composite Lognormal-Pareto Model

  • Pak, Ro Jin
    • Communications for Statistical Applications and Methods / v.20 no.4 / pp.311-319 / 2013
  • Cooray and Ananda (2005) proposed a composite lognormal-Pareto model to analyze loss payment data in the actuarial and insurance industries. Their model is based on a lognormal density up to an unknown threshold value and a two-parameter Pareto density beyond it. In this paper, we implement minimum density power divergence estimation for the composite lognormal-Pareto density. We compare the performance of the minimum density power divergence estimator (MDPDE) and the maximum likelihood estimator (MLE) by simulations and an example. The minimum density power divergence estimator performs reasonably well under various violations of the distributional assumptions; it fits small observations better and resists extraordinarily large observations better than the maximum likelihood estimator.
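The minimum density power divergence criterion of Basu et al. (1998), which the MDPDE minimizes, can be sketched for a simple normal model; for the paper's composite lognormal-Pareto density, the model density and the closed-form integral below would be replaced accordingly, and the starting values are illustrative:

```python
import numpy as np
from scipy.optimize import minimize

def mdpde_normal(x, alpha=0.5):
    """Minimum density power divergence estimate of (mu, sigma) for a
    N(mu, sigma) model.  The DPD objective is
        integral f_theta^{1+alpha} - (1 + 1/alpha) * mean(f_theta(X_i)^alpha),
    and for the normal density the integral has the closed form
        sigma^{-alpha} * (2*pi)^{-alpha/2} * (1 + alpha)^{-1/2}.
    """
    def normal_pdf(t, mu, sigma):
        return np.exp(-0.5 * ((t - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

    def objective(par):
        mu, log_sigma = par
        sigma = np.exp(log_sigma)                   # keep sigma positive
        integral = sigma ** (-alpha) * (2 * np.pi) ** (-alpha / 2) / np.sqrt(1 + alpha)
        return integral - (1 + 1 / alpha) * np.mean(normal_pdf(x, mu, sigma) ** alpha)

    # robust starting values: median and IQR-based scale
    iqr = np.subtract(*np.percentile(x, [75, 25]))
    res = minimize(objective, x0=[np.median(x), np.log(iqr / 1.349)])
    return res.x[0], np.exp(res.x[1])
```

Because outlying observations contribute only through f_theta(X_i)^alpha, which is nearly zero far from the bulk, the estimate is largely unaffected by gross contamination.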

A study on bandwidth selection based on ASE for nonparametric density estimators

  • Kim, Tae-Yoon
    • Journal of the Korean Statistical Society / v.29 no.3 / pp.307-313 / 2000
  • Suppose we have a set of data X1, ···, Xn and employ a kernel density estimator to estimate the marginal density of X. In this article the bandwidth selection problem for the kernel density estimator is examined closely. In particular, the Kullback-Leibler method (a bandwidth selection method based on the average squared error (ASE)) is considered.
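A Kullback-Leibler-type bandwidth selector can be sketched as leave-one-out likelihood cross-validation; this is a generic illustration, not the specific ASE-based selector analyzed in the paper:

```python
import numpy as np

def lcv_bandwidth(x, h_grid):
    """Leave-one-out likelihood cross-validation: choose the bandwidth
    maximizing  sum_i log f_hat_{-i}(X_i),  the sample analogue of a
    Kullback-Leibler criterion, over a grid of candidate bandwidths.
    """
    n = len(x)
    d = x[:, None] - x[None, :]                     # pairwise differences
    best_h, best_score = None, -np.inf
    for h in h_grid:
        K = np.exp(-0.5 * (d / h) ** 2) / (np.sqrt(2 * np.pi) * h)
        np.fill_diagonal(K, 0.0)                    # leave the i-th point out
        f_loo = K.sum(axis=1) / (n - 1)
        score = np.log(np.maximum(f_loo, 1e-300)).sum()
        if score > best_score:
            best_h, best_score = h, score
    return best_h
```

The grid search keeps the sketch transparent; in practice a one-dimensional optimizer over h would serve equally well.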


The Bandwidth from the Density Power Divergence

  • Pak, Ro Jin
    • Communications for Statistical Applications and Methods / v.21 no.5 / pp.435-444 / 2014
  • The most widely used optimal bandwidth is known to minimize the mean integrated squared error (MISE) of a kernel density estimator from the true density. In this article, we propose a bandwidth which asymptotically minimizes the mean integrated density power divergence (MIDPD) between the true density and the corresponding kernel density estimator. An approximated form of the mean integrated density power divergence is derived, and a bandwidth is obtained by minimizing this approximated form. The resulting bandwidth resembles the optimal bandwidth of Parzen (1962), but it reflects the nature of the model density more than the existing optimal bandwidths do. We thus have one more choice of an optimal bandwidth with a firm theoretical background; in addition, in an empirical study we show that the bandwidth from the mean integrated density power divergence can produce a density estimator fitting the sample better than the bandwidth from the mean integrated squared error.
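The MISE-optimal benchmark against which the proposed MIDPD bandwidth is compared reduces, under a normal reference density and a Gaussian kernel, to the familiar rule of thumb; a minimal sketch (the robust scale choice is a common convention, not from the paper):

```python
import numpy as np

def mise_reference_bandwidth(x):
    """Normal-reference version of the MISE-optimal bandwidth for a
    Gaussian kernel:  h = (4 / (3n))^{1/5} * sigma_hat.
    """
    n = len(x)
    # robust scale: min of sample std and IQR/1.349, a common convention
    iqr = np.subtract(*np.percentile(x, [75, 25]))
    sigma = min(np.std(x, ddof=1), iqr / 1.349)
    return (4.0 / (3.0 * n)) ** 0.2 * sigma
```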

Modified Local Density Estimation for the Log-Linear Density

  • Pak, Ro-Jin
    • Communications for Statistical Applications and Methods / v.7 no.1 / pp.13-22 / 2000
  • We consider the local likelihood method with a smoothed version of the model density instead of the original model density. For simplicity, the model is assumed to be the log-linear density; we then show that the proposed local density estimator is less affected by changes among observations, although its bias increases a little more than that of the currently used local density estimator. Hence, if we use the existing method and the proposed method together in a proper way, we can derive a local density estimator fitting the data better.


On Copas' Local Likelihood Density Estimator

  • Kim, W.C.; Park, B.U.; Kim, Y.G.
    • Journal of the Korean Statistical Society / v.30 no.1 / pp.77-87 / 2001
  • Some asymptotic results on the local likelihood density estimator of Copas (1995) are derived when the locally parametric model has several parameters. It turns out that it has the same asymptotic mean squared error as that of Hjort and Jones (1996).