Title/Summary/Keyword: kernel density


A note on nonparametric density deconvolution by weighted kernel estimators

  • Lee, Sungho
    • Journal of the Korean Data and Information Science Society
    • /
    • v.25 no.4
    • /
    • pp.951-959
    • /
    • 2014
  • Recently Hazelton and Turlach (2009) proposed a weighted kernel density estimator for the deconvolution problem. In the case of Gaussian kernels and measurement error, they argued that the weighted kernel density estimator is competitive with the classical deconvolution kernel estimator. In this paper we consider weighted kernel density estimators when the sample observations are contaminated by double exponentially distributed errors. The performance of the weighted kernel density estimators is compared with the classical deconvolution kernel estimator and with a kernel density estimator based on the support vector regression method by means of a simulation study. The weighted density estimator with the Gaussian kernel shows numerical instability in the practical implementation of its optimization function, while the weighted density estimates with the double exponential kernel follow patterns very similar to the classical kernel density estimates in the simulations, although their shape is less satisfactory than that of the classical kernel density estimator with the Gaussian kernel. (A sketch of the classical deconvolution kernel estimator is given below.)
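
A minimal sketch of the classical deconvolution kernel estimator used as the benchmark above, assuming a Gaussian kernel and double exponential (Laplace) errors with known scale sigma, so the error characteristic function is 1/(1 + sigma^2 t^2); in this case the deconvolving kernel has a closed form and no numerical Fourier inversion is needed. This is only the textbook estimator, not the authors' weighted estimator.

```python
import numpy as np

def deconv_kde_laplace(x_grid, y, h, sigma):
    """Classical deconvolution KDE for Laplace (double exponential) errors.

    With a Gaussian kernel and error characteristic function 1/(1 + sigma^2 t^2),
    the deconvolving kernel is K_dec(u) = phi(u) * (1 + (sigma/h)^2 * (1 - u^2)).
    """
    y = np.asarray(y, dtype=float)                      # contaminated observations
    u = (np.asarray(x_grid, dtype=float)[:, None] - y[None, :]) / h
    phi = np.exp(-0.5 * u**2) / np.sqrt(2.0 * np.pi)    # Gaussian kernel
    k_dec = phi * (1.0 + (sigma / h) ** 2 * (1.0 - u**2))
    return k_dec.mean(axis=1) / h                       # f_hat(x) on the grid
```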

A Support Vector Method for the Deconvolution Problem

  • Lee, Sung-Ho
    • Communications for Statistical Applications and Methods
    • /
    • v.17 no.3
    • /
    • pp.451-457
    • /
    • 2010
  • This paper considers the problem of nonparametric deconvolution density estimation when the sample observations are contaminated by double exponentially distributed errors. Three different deconvolution density estimators are introduced: a weighted kernel density estimator, a kernel density estimator based on the support vector regression method in an RKHS, and a classical kernel density estimator. The performance of these deconvolution density estimators is compared by means of a simulation study.

Jackknife Kernel Density Estimation Using Uniform Kernel Function in the Presence of k's Unidentified Outliers

  • Woo, Jung-Soo;Lee, Jang-Choon
    • Journal of the Korean Data and Information Science Society
    • /
    • v.6 no.1
    • /
    • pp.85-96
    • /
    • 1995
  • The purpose of this paper is to propose the kernel density estimator and the jackknife kernel density estimator in the presence of k's unidentified outliers, and to compare the small-sample performances of the proposed estimators in the sense of mean integrated squared error (MISE). (A sketch of a generalized-jackknife kernel estimator with the uniform kernel is given below.)
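
The jackknife construction is not spelled out in the abstract, so the sketch below shows one common generalized-jackknife combination of two uniform-kernel estimates (bandwidths h and a*h) that cancels the leading O(h^2) bias term; it ignores the paper's treatment of the unidentified outliers and is only an illustration of the general idea.

```python
import numpy as np

def kde_uniform(x_grid, data, h):
    """Kernel density estimate with the uniform kernel K(u) = 1/2 on [-1, 1]."""
    u = (np.asarray(x_grid, dtype=float)[:, None] - np.asarray(data, dtype=float)[None, :]) / h
    return (np.abs(u) <= 1).mean(axis=1) / (2.0 * h)

def kde_jackknife(x_grid, data, h, a=2.0):
    """Generalized-jackknife combination of bandwidths h and a*h (a != 1):
    the leading O(h^2) bias terms cancel in (a^2 f_h - f_ah) / (a^2 - 1)."""
    f_h = kde_uniform(x_grid, data, h)
    f_ah = kde_uniform(x_grid, data, a * h)
    return (a**2 * f_h - f_ah) / (a**2 - 1.0)
```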

The shifted Chebyshev series-based plug-in for bandwidth selection in kernel density estimation

  • Soratja Klaichim;Juthaphorn Sinsomboonthong;Thidaporn Supapakorn
    • Communications for Statistical Applications and Methods
    • /
    • v.31 no.3
    • /
    • pp.337-347
    • /
    • 2024
  • Kernel density estimation is a prevalent technique for nonparametric density estimation, enabling estimation directly from the data themselves. It involves two crucial elements: the choice of the kernel function and the determination of an appropriate bandwidth. Bandwidth selection plays an important role in kernel density estimation and has been actively developed over the past decade. A range of methods is available for selecting the bandwidth, including plug-in bandwidths. In this article, a plug-in bandwidth is proposed that leverages a shifted Chebyshev series-based approximation to determine the optimal bandwidth. Through a simulation study, the performance of the suggested bandwidth is analyzed, revealing favorable performance across a wide range of distributions and sample sizes compared to alternative bandwidths. The proposed bandwidth is also applied to kernel density estimation on a real dataset, where the outcome again indicates a favorable selection. Hence, this article serves as motivation to explore additional plug-in bandwidths that rely on function approximations using alternative series expansions. (A sketch of the generic direct plug-in formula is given below.)
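
The shifted Chebyshev series-based estimate of R(f'') is the paper's contribution and is not reproduced here. The sketch below only illustrates the generic direct plug-in formula h = [R(K) / (mu_2(K)^2 R(f'') n)]^(1/5) for a Gaussian kernel; when the normal-reference value of R(f'') is plugged in, as in this placeholder, the formula reduces to Silverman's rule of thumb.

```python
import numpy as np

def plugin_bandwidth_gaussian(data, roughness_f2=None):
    """Direct plug-in bandwidth h = [R(K) / (mu_2(K)^2 * R(f'') * n)]^(1/5)
    for the Gaussian kernel, where R(K) = 1/(2*sqrt(pi)) and mu_2(K) = 1.

    roughness_f2 is an estimate of R(f'') = int f''(x)^2 dx.  If omitted, the
    normal-reference value 3 / (8*sqrt(pi)*sigma^5) is used, which gives
    Silverman's rule of thumb h ~ 1.06 * sigma * n^(-1/5).
    """
    x = np.asarray(data, dtype=float)
    n = x.size
    if roughness_f2 is None:
        sigma = x.std(ddof=1)
        roughness_f2 = 3.0 / (8.0 * np.sqrt(np.pi) * sigma**5)
    r_k = 1.0 / (2.0 * np.sqrt(np.pi))   # roughness of the Gaussian kernel
    return (r_k / (roughness_f2 * n)) ** 0.2
```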

The Region of Positivity and Unimodality in the Truncated Series of a Nonparametric Kernel Density Estimator

  • Gupta, A.K.;Im, B.K.K.
    • Journal of the Korean Statistical Society
    • /
    • v.10
    • /
    • pp.140-144
    • /
    • 1981
  • This paper approximates a kernel density estimate by a truncated series expansion involving Hermite polynomials, since this can ease the computational burden involved in kernel-based density estimation. However, the truncated series may give a multimodal estimate even when a unimodal density is being estimated. In this paper we show a way to ensure that the truncated series is positive and unimodal, so that the approximation to a kernel density estimator is meaningful. (A sketch of the Hermite-series approximation is given below.)
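
The sketch below illustrates the kind of series approximation the abstract refers to, using the identity phi(u - a) = phi(u) * sum_k He_k(u) a^k / k!, so the sample enters only through its first m scaled moments (computed once) rather than through an O(n) sum at every evaluation point. It is an illustration of the computational shortcut only, not the paper's positivity/unimodality construction, and it assumes the data have been centered and scaled so that a moderate truncation order m suffices.

```python
import math
import numpy as np

def kde_hermite(x_grid, data, h, m=10):
    """Truncated Hermite-series approximation to a Gaussian-kernel KDE.

    f_hat(x) = (1/h) * phi(x/h) * sum_{k=0}^{m} He_k(x/h) * mu_k / k!,
    where mu_k is the k-th moment of the scaled data X_i / h.
    """
    z = np.asarray(data, dtype=float) / h        # scaled observations
    u = np.asarray(x_grid, dtype=float) / h      # scaled evaluation points
    mu = np.array([np.mean(z**k) for k in range(m + 1)])   # scaled sample moments
    # probabilists' Hermite polynomials He_k(u) via the three-term recurrence
    He = np.zeros((m + 1, u.size))
    He[0] = 1.0
    if m >= 1:
        He[1] = u
    for k in range(1, m):
        He[k + 1] = u * He[k] - k * He[k - 1]
    series = sum(He[k] * mu[k] / math.factorial(k) for k in range(m + 1))
    phi = np.exp(-0.5 * u**2) / np.sqrt(2.0 * np.pi)
    return phi * series / h                      # may go negative if m is too small
```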

A study on bandwidth selection based on ASE for nonparametric density estimators

  • Kim, Tae-Yoon
    • Journal of the Korean Statistical Society
    • /
    • v.29 no.3
    • /
    • pp.307-313
    • /
    • 2000
  • Suppose we have a set of data X1, ..., Xn and employ a kernel density estimator to estimate the marginal density of X. In this article the bandwidth selection problem for the kernel density estimator is examined closely. In particular, the Kullback-Leibler method (a bandwidth selection method based on average squared error (ASE)) is considered. (A leave-one-out likelihood cross-validation sketch is given below.)
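
As a general illustration of the Kullback-Leibler idea for bandwidth choice, a leave-one-out likelihood cross-validation selector for a Gaussian-kernel estimator might look like the sketch below (a simple grid search over candidate bandwidths); this is a generic selector, not necessarily the exact procedure examined in the paper.

```python
import numpy as np

def lcv_bandwidth(data, h_grid):
    """Likelihood (Kullback-Leibler) cross-validation: choose h maximizing
    sum_i log f_hat_{h,-i}(X_i), the leave-one-out log-likelihood."""
    x = np.asarray(data, dtype=float)
    n = x.size
    best_h, best_score = None, -np.inf
    for h in h_grid:
        u = (x[:, None] - x[None, :]) / h
        k = np.exp(-0.5 * u**2) / np.sqrt(2.0 * np.pi)   # Gaussian kernel matrix
        np.fill_diagonal(k, 0.0)                         # leave observation i out
        f_loo = k.sum(axis=1) / ((n - 1) * h)
        score = np.sum(np.log(np.maximum(f_loo, 1e-300)))
        if score > best_score:
            best_h, best_score = h, score
    return best_h
```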

A Note on Deconvolution Estimators when Measurement Errors are Normal

  • Lee, Sung-Ho
    • Communications for Statistical Applications and Methods
    • /
    • v.19 no.4
    • /
    • pp.517-526
    • /
    • 2012
  • In this paper a support vector method is proposed for use when the sample observations are contaminated by normally distributed measurement error. The performance of deconvolution density estimators based on the support vector method is explored and compared with kernel density estimators by means of a simulation study. An interesting result was that, for the estimation of a kurtotic density, the support vector deconvolution estimator with a Gaussian kernel showed better performance than the classical deconvolution kernel estimator.

A note on SVM estimators in RKHS for the deconvolution problem

  • Lee, Sungho
    • Communications for Statistical Applications and Methods
    • /
    • v.23 no.1
    • /
    • pp.71-83
    • /
    • 2016
  • In this paper we discuss a deconvolution density estimator obtained using support vector machines (SVM) and Tikhonov's regularization method for solving ill-posed problems in a reproducing kernel Hilbert space (RKHS). A remarkable property of the SVM is that it leads to sparse solutions, but the support vector deconvolution density estimator does not preserve sparsity as well as we expected. Thus, in Section 3, we propose another support vector deconvolution estimator (method II) which leads to a very sparse solution. The performance of the deconvolution density estimators based on the support vector method is compared with the classical kernel deconvolution density estimator for the important cases of Gaussian and Laplacian measurement error by means of a simulation study. In the case of Gaussian error, the proposed support vector deconvolution estimator shows the same performance as the classical kernel deconvolution density estimator. (A sketch of Tikhonov regularization in an RKHS is given below.)
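
The sketch below only illustrates the Tikhonov-regularization machinery in an RKHS that the abstract builds on, i.e. a regularized least-squares fit with a Gaussian reproducing kernel (kernel ridge regression); it does not implement the paper's deconvolution estimators (method I or II).

```python
import numpy as np

def gaussian_gram(s, t, gamma):
    """Gram matrix of the Gaussian reproducing kernel k(s, t) = exp(-gamma (s - t)^2)."""
    d = np.asarray(s, dtype=float)[:, None] - np.asarray(t, dtype=float)[None, :]
    return np.exp(-gamma * d**2)

def tikhonov_rkhs_fit(x, y, lam, gamma):
    """Minimize sum_i (y_i - f(x_i))^2 + lam * ||f||_H^2 over the RKHS H.

    By the representer theorem f(t) = sum_i alpha_i k(t, x_i), and the
    Tikhonov-regularized coefficients solve (K + lam I) alpha = y.
    """
    K = gaussian_gram(x, x, gamma)
    alpha = np.linalg.solve(K + lam * np.eye(K.shape[0]), np.asarray(y, dtype=float))
    return lambda t: gaussian_gram(t, x, gamma) @ alpha   # fitted function
```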

Nonparametric Discontinuity Point Estimation in Density or Density Derivatives

  • Huh, Jib
    • Journal of the Korean Statistical Society
    • /
    • v.31 no.2
    • /
    • pp.261-276
    • /
    • 2002
  • A probability density or its derivatives may have a discontinuity (change point) at an unknown location. We propose a method for estimating the location and the jump size of the discontinuity point based on kernel-type density or density-derivative estimators with one-sided equivalent kernels. The rates of convergence of the proposed estimators are derived, and the finite-sample performance of the methods is illustrated by simulated examples. (A sketch of the one-sided-kernel idea for densities is given below.)
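
The sketch below illustrates the one-sided-kernel idea for densities only, using uniform one-sided kernels: estimate the density separately from the right and from the left of each candidate point, locate the discontinuity where the two estimates differ most, and read the jump size off that difference. The paper's estimators use one-sided equivalent kernels and also cover density derivatives, which this sketch does not.

```python
import numpy as np

def one_sided_kde(x_grid, data, h, side):
    """One-sided KDE with a uniform kernel rescaled to integrate to one:
    'right' uses K(u) = 1 on [0, 1], 'left' uses K(u) = 1 on [-1, 0)."""
    u = (np.asarray(data, dtype=float)[None, :] - np.asarray(x_grid, dtype=float)[:, None]) / h
    w = (u >= 0) & (u <= 1) if side == "right" else (u >= -1) & (u < 0)
    return w.mean(axis=1) / h

def estimate_jump(x_grid, data, h):
    """Estimate the discontinuity location as the point where the right- and
    left-sided estimates differ most; the signed difference estimates the jump size."""
    diff = one_sided_kde(x_grid, data, h, "right") - one_sided_kde(x_grid, data, h, "left")
    i = int(np.argmax(np.abs(diff)))
    return x_grid[i], diff[i]
```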