• Title/Summary/Keyword: Kernel estimator


A Berry-Esseen Type Bound in Kernel Density Estimation for a Random Left-Truncation Model

  • Asghari, P.; Fakoor, V.; Sarmad, M.
    • Communications for Statistical Applications and Methods / v.21 no.2 / pp.115-124 / 2014
  • In this paper we derive a Berry-Esseen type bound for the kernel density estimator in a random left-truncation model, in which each datum $Y$ is randomly left truncated and is sampled only if $Y \geq T$, where $T$ is the truncation random variable with an unknown distribution. This unknown distribution is estimated with the Lynden-Bell estimator. In particular, the normal approximation rate, by choice of the bandwidth, is shown to be close to $n^{-1/6}$ modulo a logarithmic term. We have also investigated this normal approximation rate via a simulation study.
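The Lynden-Bell product-limit estimator that this abstract relies on can be sketched in a few lines. This is a minimal pure-Python illustration under our own naming and toy data (nothing here is from the paper itself); each observed pair $(t_i, y_i)$ survives the truncation $y_i \geq t_i$:

```python
def lynden_bell_cdf(pairs):
    """Product-limit (Lynden-Bell) estimator of the CDF of Y from
    left-truncated pairs (t_i, y_i), each observed only because y_i >= t_i."""
    ys = sorted(y for _, y in pairs)

    def n_at_risk(s):
        # number of pairs whose interval [t_j, y_j] covers the point s
        return sum(1 for t, y in pairs if t <= s <= y)

    def F(y):
        prod = 1.0
        for yi in ys:
            if yi <= y:
                prod *= 1.0 - 1.0 / n_at_risk(yi)
        return 1.0 - prod

    return F

# Sanity check: with no effective truncation (every t_i below every y_i)
# the estimator collapses to the ordinary empirical CDF.
pairs = [(0.0, 1.0), (0.0, 2.0), (0.0, 3.0), (0.0, 4.0)]
F = lynden_bell_cdf(pairs)
```

In the paper this estimated truncation distribution then feeds the kernel density estimator; the sketch stops at the product-limit step.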

A Study on Bandwidth Selection Based on ASE for Nonparametric Regression Estimator

  • Kim, Tae-Yoon
    • Journal of the Korean Statistical Society / v.30 no.1 / pp.21-30 / 2001
  • Suppose we observe a set of data $(X_1, Y_1), \ldots, (X_n, Y_n)$ and use the Nadaraya-Watson regression estimator to estimate $m(x) = E(Y|X = x)$. In this article the bandwidth selection problem for the Nadaraya-Watson regression estimator is investigated. In particular, a cross-validation method based on the average squared error (ASE) is considered. Theoretical results here include a central limit theorem that quantifies the convergence rate of the bandwidth selector.
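The Nadaraya-Watson estimator and a cross-validated bandwidth choice can be sketched as follows. This is a toy pure-Python illustration of the general technique, not the paper's ASE-based selector; the leave-one-out score below is only a common proxy for it, and the data and candidate bandwidths are ours:

```python
import math

def nw(x, xs, ys, h):
    """Nadaraya-Watson estimate of m(x) = E(Y | X = x), Gaussian kernel."""
    w = [math.exp(-0.5 * ((x - xi) / h) ** 2) for xi in xs]
    return sum(wi * yi for wi, yi in zip(w, ys)) / sum(w)

def cv_score(xs, ys, h):
    """Leave-one-out cross-validation score for bandwidth h."""
    err = 0.0
    for i in range(len(xs)):
        xs_i = xs[:i] + xs[i + 1:]
        ys_i = ys[:i] + ys[i + 1:]
        err += (ys[i] - nw(xs[i], xs_i, ys_i, h)) ** 2
    return err / len(xs)

# Noiseless data from m(x) = x^2 on a grid; pick h from a small candidate set.
xs = [i / 20 for i in range(21)]
ys = [x * x for x in xs]
best_h = min([0.05, 0.1, 0.2, 0.4], key=lambda h: cv_score(xs, ys, h))
```

With noiseless data the smallest candidate bandwidth tends to win; with noisy data the cross-validation score trades bias against variance.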


A STUDY ON RELATIVE EFFICIENCY OF KERNEL TYPE ESTIMATORS OF SMOOTH DISTRIBUTION FUNCTIONS

  • Jee, Eun-Sook
    • The Pure and Applied Mathematics / v.1 no.1 / pp.19-24 / 1994
  • Let P be a probability measure on the real line with Lebesgue density f. The usual estimator of the distribution function (df) of P for the sample $x_1, \ldots, x_n$ is the empirical df $F_n(t)$ = (equation omitted). But this estimator does not take into account the smoothness of F, that is, the existence of a density f. Therefore, one should expect that an estimator which is better adapted to this situation beats the empirical df with respect to a reasonable measure of performance. (omitted)
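A standard way to "smooth" the empirical df, in the spirit of this abstract, is to replace the step indicator by the kernel's own CDF. A minimal sketch with a Gaussian kernel (function names and data are ours, not the paper's):

```python
import math

def ecdf(t, xs):
    """Empirical distribution function F_n(t)."""
    return sum(1 for x in xs if x <= t) / len(xs)

def smooth_cdf(t, xs, h):
    """Kernel df estimate: average Phi((t - x_i)/h) in place of 1{x_i <= t},
    where Phi is the standard normal CDF."""
    phi = lambda z: 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
    return sum(phi((t - x) / h) for x in xs) / len(xs)

# The smooth estimate interpolates the jumps of the empirical df.
xs = [-1.5, -0.5, 0.0, 0.5, 1.5]
```

At a sample point the empirical df jumps by 1/n, while the kernel version passes smoothly through the midpoint of the jump; this is the smoothness the abstract says the empirical df ignores.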


Robustness of Minimum Disparity Estimators in Linear Regression Models

  • Pak, Ro-Jin
    • Journal of the Korean Statistical Society / v.24 no.2 / pp.349-360 / 1995
  • This paper deals with the robustness properties of minimum disparity estimation in linear regression models. The estimators are defined as the statistical quantities which minimize the blended weight Hellinger distance between a weighted kernel density estimator of the residuals and a smoothed model density of the residuals. It is shown that if the weights of the density estimator are appropriately chosen, the estimates of the regression parameters are robust.
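The minimum-disparity idea can be illustrated on the simplest case, a location model, using the plain squared Hellinger distance rather than the paper's blended weight Hellinger distance. Everything below (data, grid, bandwidth) is our own toy setup, a sketch of the mechanism only:

```python
import math

def kde(z, data, h):
    """Gaussian kernel density estimate at z."""
    c = 1.0 / (len(data) * h * math.sqrt(2 * math.pi))
    return c * sum(math.exp(-0.5 * ((z - d) / h) ** 2) for d in data)

def normal_pdf(z):
    return math.exp(-0.5 * z * z) / math.sqrt(2 * math.pi)

def hellinger2(mu, ys, h, grid):
    """Squared Hellinger distance (on a grid) between the KDE of the
    residuals y_i - mu and the standard normal model density."""
    res = [y - mu for y in ys]
    dz = grid[1] - grid[0]
    return sum((math.sqrt(kde(z, res, h)) - math.sqrt(normal_pdf(z))) ** 2
               for z in grid) * dz

# Location model y = mu + e: grid-search the disparity for clean data plus
# one gross outlier; the minimizer stays near the true mu = 0, unlike the mean.
ys = [-1.0, -0.5, 0.0, 0.5, 1.0, 50.0]
grid = [-5 + i * 0.1 for i in range(101)]
mus = [i * 0.25 for i in range(-8, 9)]
mu_hat = min(mus, key=lambda m: hellinger2(m, ys, 0.5, grid))
```

The outlier's kernel mass sits far from the model density, so it contributes a roughly constant penalty and barely moves the minimizer, which is the robustness property the paper studies for regression parameters.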


Kernel Ridge Regression with Randomly Right Censored Data

  • Shim, Joo-Yong; Seok, Kyung-Ha
    • Communications for Statistical Applications and Methods / v.15 no.2 / pp.205-211 / 2008
  • This paper deals with the estimation of kernel ridge regression when the responses are subject to random right censoring. The iteratively reweighted least squares (IRWLS) procedure is employed to treat censored observations. The hyperparameters of the model, which affect the performance of the proposed procedure, are selected by a generalized cross-validation (GCV) function. Experimental results are then presented which indicate the performance of the proposed procedure.
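Plain (uncensored) kernel ridge regression, the starting point of this paper, amounts to solving $(K + \lambda I)\alpha = y$. The sketch below is our own minimal pure-Python version; the censoring-aware IRWLS and the GCV hyperparameter selection from the paper are omitted, and the kernel, `gamma`, and `lam` values are arbitrary choices:

```python
import math

def rbf(a, b, gamma=10.0):
    return math.exp(-gamma * (a - b) ** 2)

def solve(A, b):
    """Gaussian elimination with partial pivoting for a small dense system."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for k in range(n):
        p = max(range(k, n), key=lambda r: abs(M[r][k]))
        M[k], M[p] = M[p], M[k]
        for r in range(k + 1, n):
            f = M[r][k] / M[k][k]
            for c in range(k, n + 1):
                M[r][c] -= f * M[k][c]
    x = [0.0] * n
    for k in range(n - 1, -1, -1):
        x[k] = (M[k][n] - sum(M[k][c] * x[c] for c in range(k + 1, n))) / M[k][k]
    return x

def krr_fit(xs, ys, lam=1e-3):
    """Kernel ridge regression: solve (K + lam*I) alpha = y."""
    n = len(xs)
    K = [[rbf(xs[i], xs[j]) + (lam if i == j else 0.0) for j in range(n)]
         for i in range(n)]
    alpha = solve(K, ys)
    return lambda x: sum(a * rbf(x, xi) for a, xi in zip(alpha, xs))

xs = [i / 10 for i in range(11)]
ys = [math.sin(2 * math.pi * x) for x in xs]
f = krr_fit(xs, ys)
```

The fitted function is a kernel expansion over the training points; the paper's IRWLS step would re-weight the rows of this system to account for censored responses.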

On the Equality of Two Distributions Based on Nonparametric Kernel Density Estimator

  • Kim, Dae-Hak; Oh, Kwang-Sik
    • Journal of the Korean Data and Information Science Society / v.14 no.2 / pp.247-255 / 2003
  • Hypothesis testing for the equality of two distributions was considered. Nonparametric kernel density estimates were used for testing the equality of the distributions, with a cross-validatory choice of bandwidth in the kernel density estimation. The sampling distribution of the test statistic was developed by a resampling method, the bootstrap. Small-sample Monte Carlo simulations were conducted, and the empirical power of the considered tests was compared for a variety of distributions.


A STUDY ON KERNEL ESTIMATION OF A SMOOTH DISTRIBUTION FUNCTION ON CENSORED DATA

  • Jee, Eun Sook
    • The Mathematical Education / v.31 no.2 / pp.133-140 / 1992
  • The problem of estimating a smooth distribution function F at a point $\tau$ based on randomly right-censored data is treated under certain smoothness conditions on F. The asymptotic performance of a certain class of kernel estimators is compared to that of the Kaplan-Meier estimator of F($\tau$). It is shown that the relative deficiency of the Kaplan-Meier estimator of F($\tau$) with respect to the appropriately chosen kernel-type estimator tends to infinity as the sample size n increases to infinity. Strong uniform consistency and the weak convergence of the normalized process are also proved.
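The Kaplan-Meier estimator, the benchmark the abstract compares against, can be sketched compactly. This is a standard textbook form in our own pure-Python phrasing (the kernel-smoothed competitor from the paper is not shown):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier estimate of the survival function S(t) from right-censored
    data: times[i] = min(T_i, C_i); events[i] = 1 iff the event was observed.
    Processing one subject at a time in time order reproduces the usual
    product over distinct event times, including ties."""
    n = len(times)
    order = sorted(range(n), key=lambda i: times[i])
    s, curve, at_risk = 1.0, [], n
    for i in order:
        if events[i]:
            s *= 1.0 - 1.0 / at_risk
        curve.append((times[i], s))
        at_risk -= 1

    def S(t):
        val = 1.0
        for ti, si in curve:
            if ti <= t:
                val = si
        return val

    return S

# With no censoring, Kaplan-Meier reduces to the empirical survival function;
# a censored subject leaves the risk set without forcing a downward step.
S_full = kaplan_meier([1.0, 2.0, 3.0, 4.0], [1, 1, 1, 1])
S_cens = kaplan_meier([1.0, 2.0, 3.0], [1, 0, 1])
```

The distribution function estimate is then $1 - S(t)$; the paper's point is that a kernel-smoothed version of this curve is asymptotically more efficient at smooth F.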


Comparison study on kernel type estimators of discontinuous log-variance (불연속 로그분산함수의 커널추정량들의 비교 연구)

  • Huh, Jib
    • Journal of the Korean Data and Information Science Society / v.25 no.1 / pp.87-95 / 2014
  • In the regression model, Kang and Huh (2006) studied the estimation of the discontinuous variance function using the Nadaraya-Watson estimator with the squared residuals. The local linear estimator of the log-variance function, which can take any real value, was proposed by Huh (2013) based on the kernel-weighted local likelihood of the ${\chi}^2$-distribution. Chen et al. (2009) estimated the continuous variance function using a local linear fit with the log-squared residuals. In this paper, an estimator of the discontinuous log-variance function itself, or of its derivative, is proposed using Chen et al. (2009)'s estimator. Numerical work investigates the performance of the estimators with simulated examples.
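The common building block of these papers, a local linear fit applied to log-squared residuals, can be sketched as follows. This is our own bare-bones version: it ignores the discontinuity handling that is the paper's actual contribution, as well as the known additive bias of $\log$-$\chi^2$ residuals:

```python
import math

def local_linear(x, xs, zs, h):
    """Local linear fit at x: Gaussian-weighted least squares of z on (x_i - x);
    the fitted intercept is the estimate at x."""
    w = [math.exp(-0.5 * ((xi - x) / h) ** 2) for xi in xs]
    s0 = sum(w)
    s1 = sum(wi * (xi - x) for wi, xi in zip(w, xs))
    s2 = sum(wi * (xi - x) ** 2 for wi, xi in zip(w, xs))
    t0 = sum(wi * zi for wi, zi in zip(w, zs))
    t1 = sum(wi * (xi - x) * zi for wi, xi, zi in zip(w, xs, zs))
    return (s2 * t0 - s1 * t1) / (s0 * s2 - s1 ** 2)

def log_variance(x, xs, resid, h):
    """Variance-function sketch: local linear regression of log(residual^2)."""
    zs = [math.log(r * r) for r in resid]
    return local_linear(x, xs, zs, h)

xs = [i / 20 for i in range(21)]
```

A useful sanity property: a local linear fit reproduces linear target functions exactly, which is also why it behaves well at boundaries.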

The Minimum Squared Distance Estimator and the Minimum Density Power Divergence Estimator

  • Pak, Ro-Jin
    • Communications for Statistical Applications and Methods / v.16 no.6 / pp.989-995 / 2009
  • Basu et al. (1998) proposed a minimum divergence estimating method which is free from the use of a kernel density estimator. Their proposed class of density power divergences is indexed by a single parameter $\alpha$ which controls the trade-off between robustness and efficiency. In this article, (1) we introduce a new large class, the minimum squared distance, which ranges from the minimum Hellinger distance to the minimum $L_2$ distance, and show that under certain conditions both the minimum density power divergence estimator (MDPDE) and the minimum squared distance estimator (MSDE) are asymptotically equivalent; and (2) in finite samples the MDPDE generally performs better than the MSDE, although there are some cases where the MSDE performs better, such as when estimating a location parameter or a proportion of mixed distributions.
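The MDPDE of Basu et al. (1998) can be sketched for a normal location model with known scale. The grid search, data, and $\alpha = 0.5$ below are our own toy choices; note the objective needs no kernel density estimate of the data, which is the point of the method:

```python
import math

def normal_pdf(z, mu):
    return math.exp(-0.5 * (z - mu) ** 2) / math.sqrt(2 * math.pi)

def dpd_objective(mu, xs, alpha, grid):
    """Empirical density power divergence objective of Basu et al. (1998),
    up to an additive term that does not involve the parameter or the data:
    integral f^(1+alpha) - (1 + 1/alpha) * mean of f(x_i)^alpha."""
    dz = grid[1] - grid[0]
    integral = sum(normal_pdf(z, mu) ** (1 + alpha) for z in grid) * dz
    avg = sum(normal_pdf(x, mu) ** alpha for x in xs) / len(xs)
    return integral - (1 + 1 / alpha) * avg

xs = [-1.0, -0.5, 0.0, 0.5, 1.0, 50.0]       # clean data plus a gross outlier
grid = [-60 + i * 0.1 for i in range(1401)]  # covers all observations
mus = [i * 0.25 for i in range(-8, 9)]
mu_hat = min(mus, key=lambda m: dpd_objective(m, xs, 0.5, grid))
```

Because the outlier enters only through $f_\mu(50)^\alpha \approx 0$, the minimizer stays near the bulk of the data, while the sample mean is dragged far away; $\alpha \to 0$ recovers maximum likelihood.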

Modification of boundary bias in nonparametric regression (비모수적 회귀선추정의 바운더리 편의 수정)

  • 차경준
    • The Korean Journal of Applied Statistics / v.6 no.2 / pp.329-339 / 1993
  • Kernel regression is a nonparametric regression technique which requires only differentiability of the true function. If one uses the kernel regression technique to produce smooth estimates of a curve over a finite interval, one finds that distinct boundary problems arise which detract from the global performance of the estimator. This paper develops a kernel to handle the boundary problem. In order to develop the boundary kernel, the generalized jackknife method of Gray and Schucany (1972) is adapted. It is also shown that the boundary kernel has the same order of convergence rate as in the interior (non-boundary) region.
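The boundary problem itself is easy to exhibit; the sketch below (our own toy setup, not the paper's jackknife-based boundary kernel, which is the proposed fix) shows the Nadaraya-Watson estimator being pulled toward the interior at the edge of the design interval:

```python
import math

def nw(x, xs, ys, h):
    """Nadaraya-Watson estimator with a Gaussian kernel."""
    w = [math.exp(-0.5 * ((x - xi) / h) ** 2) for xi in xs]
    return sum(wi * yi for wi, yi in zip(w, ys)) / sum(w)

# m(x) = x on [0, 1] with noiseless responses, so all error is kernel bias.
# At x = 0 the kernel weights are one-sided (no data below 0), dragging the
# estimate upward; at an interior point the weights are symmetric.
xs = [i / 100 for i in range(101)]
ys = xs[:]
h = 0.1
err_boundary = abs(nw(0.0, xs, ys, h) - 0.0)
err_interior = abs(nw(0.5, xs, ys, h) - 0.5)
```

The boundary error here is of order $h$ rather than $h^2$; a boundary kernel of the kind the paper constructs restores the interior convergence rate at the edges.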
