• Title/Summary/Keyword: Kernel Density Estimator


Estimation of Non-Gaussian Probability Density by Dynamic Bayesian Networks

  • Cho, Hyun-C.;Fadali, Sami M.;Lee, Kwon-S.
    • Institute of Control, Robotics and Systems (ICROS) Conference Proceedings
    • /
    • 2005.06a
    • /
    • pp.408-413
    • /
    • 2005
  • A new methodology for discrete non-Gaussian probability density estimation is investigated in this paper based on a dynamic Bayesian network (DBN) and kernel functions. The estimator consists of a DBN in which the transition distribution is represented with kernel functions. The estimator parameters are determined through a recursive learning algorithm according to the maximum likelihood (ML) scheme. A discrete-type Poisson distribution is generated in a simulation experiment to evaluate the proposed method. In addition, an unknown probability density generated by nonlinear transformation of a Poisson random variable is simulated. Computer simulations numerically demonstrate that the method successfully estimates the unknown probability distribution function (PDF).
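
Neither the DBN structure nor the recursive ML learning rule is spelled out in the abstract, so the following is only a loose illustration of the kernel idea behind it: a Gaussian-kernel smoother applied to Poisson draws on an integer support (the rate, sample size, and kernel width are assumed values), compared against the true PMF.

```python
import numpy as np
from scipy.stats import poisson

rng = np.random.default_rng(0)
lam, n, h = 4.0, 500, 0.8            # Poisson rate, sample size, kernel width (assumed)
x = rng.poisson(lam, size=n)

support = np.arange(0, 16)
# Gaussian kernel weight between every support point and every observation
w = np.exp(-0.5 * ((support[:, None] - x[None, :]) / h) ** 2)
pmf_hat = w.sum(axis=1)
pmf_hat /= pmf_hat.sum()             # renormalise so the estimate sums to one

for k in support:
    print(f"k={k:2d}  estimate={pmf_hat[k]:.3f}  true={poisson.pmf(k, lam):.3f}")
```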


Robustness of Minimum Disparity Estimators in Linear Regression Models

  • Pak, Ro-Jin
    • Journal of the Korean Statistical Society
    • /
    • v.24 no.2
    • /
    • pp.349-360
    • /
    • 1995
  • This paper deals with the robustness properties of minimum disparity estimation in linear regression models. The estimators are defined as the statistical quantities which minimize the blended weight Hellinger distance between a weighted kernel density estimator of the residuals and a smoothed model density of the residuals. It is shown that if the weights of the density estimator are appropriately chosen, the estimates of the regression parameters are robust.
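
As a rough illustration of the disparity idea (not the paper's blended weight Hellinger distance), the sketch below fits a simple linear regression by minimising the squared Hellinger distance between a Gaussian-kernel density estimate of the residuals and the correspondingly smoothed normal model density; the bandwidth, grid, and error scale are assumptions.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(1)
n, h, sigma = 200, 0.4, 1.0                      # sample size, bandwidth, error scale (assumed)
X = rng.uniform(-2.0, 2.0, n)
y = 1.5 + 2.0 * X + rng.normal(0.0, sigma, n)

grid = np.linspace(-5.0, 5.0, 400)
dg = grid[1] - grid[0]

def hellinger_objective(beta):
    resid = y - beta[0] - beta[1] * X
    # Gaussian-kernel density estimate of the residuals on the grid
    g = norm.pdf((grid[:, None] - resid[None, :]) / h).mean(axis=1) / h
    # model density smoothed with the same kernel: N(0, sigma^2 + h^2)
    f = norm.pdf(grid, scale=np.sqrt(sigma ** 2 + h ** 2))
    return np.sum((np.sqrt(g) - np.sqrt(f)) ** 2) * dg   # squared Hellinger distance

slope0, intercept0 = np.polyfit(X, y, 1)              # least-squares starting point
fit = minimize(hellinger_objective, x0=[intercept0, slope0], method="Nelder-Mead")
print("minimum-disparity intercept and slope:", fit.x)
```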


Online Probability Density Estimation of Nonstationary Random Signal using Dynamic Bayesian Networks

  • Cho, Hyun-Cheol;Fadali, M. Sami;Lee, Kwon-Soon
    • International Journal of Control, Automation, and Systems
    • /
    • v.6 no.1
    • /
    • pp.109-118
    • /
    • 2008
  • We present two estimators for discrete non-Gaussian and nonstationary probability density estimation based on a dynamic Bayesian network (DBN). The first estimator is for off line computation and consists of a DBN whose transition distribution is represented in terms of kernel functions. The estimator parameters are the weights and shifts of the kernel functions. The parameters are determined through a recursive learning algorithm using maximum likelihood (ML) estimation. The second estimator is a DBN whose parameters form the transition probabilities. We use an asymptotically convergent, recursive, on-line algorithm to update the parameters using observation data. The DBN calculates the state probabilities using the estimated parameters. We provide examples that demonstrate the usefulness and simplicity of the two proposed estimators.
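
The abstract does not give the DBN transition parameterisation, so the sketch below substitutes a generic exponentially forgetting kernel update on a fixed grid, which is one simple way to track a nonstationary density online; the kernel width, forgetting factor, and drifting-mean signal are all assumptions, not the authors' recursion.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)
grid = np.linspace(-6.0, 6.0, 300)
h, gamma = 0.5, 0.02                      # kernel width and forgetting factor (assumed)
f_hat = np.zeros_like(grid)

for t in range(2000):
    mean_t = 2.0 * np.sin(2 * np.pi * t / 2000)   # slowly drifting mean
    x_t = rng.normal(mean_t, 1.0)
    kernel = norm.pdf(grid, loc=x_t, scale=h)
    f_hat = (1 - gamma) * f_hat + gamma * kernel  # online density update

print("estimate integrates to", np.sum(f_hat) * (grid[1] - grid[0]))
```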

BERRY-ESSEEN BOUNDS OF RECURSIVE KERNEL ESTIMATOR OF DENSITY UNDER STRONG MIXING ASSUMPTIONS

  • Liu, Yu-Xiao;Niu, Si-Li
    • Bulletin of the Korean Mathematical Society
    • /
    • v.54 no.1
    • /
    • pp.343-358
    • /
    • 2017
  • Let $\{X_i\}$ be a sequence of stationary $\alpha$-mixing random variables with probability density function $f(x)$. The recursive kernel estimators of $f(x)$ are defined by $$\hat{f}_n(x)=\frac{1}{n\sqrt{b_n}}\sum_{j=1}^{n}b_j^{-1/2}K\!\left(\frac{x-X_j}{b_j}\right) \quad\text{and}\quad \tilde{f}_n(x)=\frac{1}{n}\sum_{j=1}^{n}\frac{1}{b_j}K\!\left(\frac{x-X_j}{b_j}\right),$$ where $b_n>0$ with $b_n\rightarrow 0$ is the bandwidth and $K$ is a kernel function. Under appropriate conditions, we establish Berry-Esseen bounds for these estimators of $f(x)$, which give the convergence rates of the asymptotic normality of the estimators.
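
A direct transcription of the two recursive estimators quoted in the abstract, with a Gaussian kernel and an assumed bandwidth sequence b_j = j^(-1/5), evaluated on an i.i.d. standard normal sample for comparison with the true density (the Berry-Esseen analysis itself is not reproduced).

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(3)
X = rng.normal(size=500)
n = len(X)
j = np.arange(1, n + 1)
b = j ** (-0.2)                        # bandwidth sequence b_j -> 0 (assumed choice)

def f_hat(x):
    # (1 / (n * sqrt(b_n))) * sum_j b_j^{-1/2} K((x - X_j) / b_j)
    return np.sum(b ** -0.5 * norm.pdf((x - X) / b)) / (n * np.sqrt(b[-1]))

def f_tilde(x):
    # (1 / n) * sum_j (1 / b_j) K((x - X_j) / b_j)
    return np.mean(norm.pdf((x - X) / b) / b)

for x0 in (-1.0, 0.0, 1.0):
    print(x0, f_hat(x0), f_tilde(x0), norm.pdf(x0))
```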

A Berry-Esseen Type Bound in Kernel Density Estimation for a Random Left-Truncation Model

  • Asghari, P.;Fakoor, V.;Sarmad, M.
    • Communications for Statistical Applications and Methods
    • /
    • v.21 no.2
    • /
    • pp.115-124
    • /
    • 2014
  • In this paper we derive a Berry-Esseen type bound for the kernel density estimator in a random left-truncation model, in which each datum $Y$ is randomly left truncated and is sampled only if $Y \geq T$, where $T$ is the truncation random variable with an unknown distribution. This unknown distribution is estimated with the Lynden-Bell estimator. In particular, the normal approximation rate, by choice of the bandwidth, is shown to be close to $n^{-1/6}$ modulo a logarithmic term. We have also investigated this normal approximation rate via a simulation study.
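
The sketch below only reproduces the sampling scheme and the Lynden-Bell product-limit estimator that underlies this kind of kernel estimator (ties are ignored and the normal truncation model is an assumed example); the Berry-Esseen bound itself is not computed.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(4)
Y_all = rng.normal(1.0, 1.0, 2000)     # variable of interest
T_all = rng.normal(-1.0, 1.0, 2000)    # truncation variable
keep = Y_all >= T_all                  # a pair is observed only if Y >= T
Y, T = Y_all[keep], T_all[keep]
n = len(Y)

# nC[i] = number of observed pairs whose interval [T_j, Y_j] covers Y_i
nC = ((T[None, :] <= Y[:, None]) & (Y[:, None] <= Y[None, :])).sum(axis=1)

def F_lynden_bell(y):
    # Lynden-Bell product-limit estimate of P(Y <= y), ignoring ties
    return 1.0 - np.prod(1.0 - 1.0 / nC[Y <= y])

print("Lynden-Bell F(1.0):", F_lynden_bell(1.0))
print("true (untruncated) F(1.0):", norm.cdf(1.0, loc=1.0, scale=1.0))
```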

On the Equality of Two Distributions Based on Nonparametric Kernel Density Estimator

  • Kim, Dae-Hak;Oh, Kwang-Sik
    • Journal of the Korean Data and Information Science Society
    • /
    • v.14 no.2
    • /
    • pp.247-255
    • /
    • 2003
  • Hypothesis testing for the equality of two distributions was considered. Nonparametric kernel density estimates were used for testing the equality of the distributions, with a cross-validatory choice of bandwidth in the kernel density estimation. The sampling distribution of the considered test statistic was developed by a resampling method, the bootstrap. Small-sample Monte Carlo simulations were conducted, and the empirical power of the considered tests was compared for a variety of distributions.
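
A minimal sketch of this kind of test, with assumed choices: the integrated squared difference between the two kernel density estimates as the statistic, scipy's default Scott bandwidth in place of the paper's cross-validatory choice, and a pooled-sample bootstrap for the null distribution.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(5)
x = rng.normal(0.0, 1.0, 100)
y = rng.normal(0.5, 1.0, 100)

grid = np.linspace(-5.0, 6.0, 300)
dg = grid[1] - grid[0]

def ised(a, b):
    # integrated squared difference between two kernel density estimates
    return np.sum((gaussian_kde(a)(grid) - gaussian_kde(b)(grid)) ** 2) * dg

t_obs = ised(x, y)
pooled = np.concatenate([x, y])
t_boot = []
for _ in range(500):
    s = rng.choice(pooled, size=pooled.size, replace=True)   # resample under H0
    t_boot.append(ised(s[:x.size], s[x.size:]))

print("bootstrap p-value:", np.mean(np.array(t_boot) >= t_obs))
```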


The Minimum Squared Distance Estimator and the Minimum Density Power Divergence Estimator

  • Pak, Ro-Jin
    • Communications for Statistical Applications and Methods
    • /
    • v.16 no.6
    • /
    • pp.989-995
    • /
    • 2009
  • Basu et al. (1998) proposed a minimum divergence estimation method that does not require a kernel density estimator. Their proposed class of density power divergences is indexed by a single parameter $\alpha$ which controls the trade-off between robustness and efficiency. In this article, (1) we introduce a new large class, the minimum squared distance, which ranges from the minimum Hellinger distance to the minimum $L_2$ distance, and show that under certain conditions both the minimum density power divergence estimator (MDPDE) and the minimum squared distance estimator (MSDE) are asymptotically equivalent; and (2) in finite samples the MDPDE generally performs better than the MSDE, but there are some cases where the MSDE performs better than the MDPDE when estimating a location parameter or a proportion of mixed distributions.
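
A minimal sketch of the minimum density power divergence estimator of Basu et al. (1998) for a normal location parameter, showing the kernel-free objective and its robustness to a few outliers; the value alpha = 0.5, the known scale, and the numerical evaluation of the integral term are assumptions.

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import norm

rng = np.random.default_rng(6)
x = np.concatenate([rng.normal(0.0, 1.0, 95), rng.normal(8.0, 1.0, 5)])  # 5% outliers

alpha, sigma = 0.5, 1.0                     # trade-off parameter and known scale (assumed)
grid = np.linspace(-10.0, 15.0, 2000)
dg = grid[1] - grid[0]

def dpd(theta):
    # empirical density power divergence objective of Basu et al. (1998)
    f_grid = norm.pdf(grid, loc=theta, scale=sigma)
    integral = np.sum(f_grid ** (1 + alpha)) * dg
    return integral - (1 + 1 / alpha) * np.mean(norm.pdf(x, loc=theta, scale=sigma) ** alpha)

fit = minimize_scalar(dpd, bounds=(-5.0, 10.0), method="bounded")
print("MDPDE location:", fit.x, "  sample mean:", x.mean())
```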

A STUDY ON RELATIVE EFFICIENCY OF KERNEL TYPE ESTIMATORS OF SMOOTH DISTRIBUTION FUNCTIONS

  • Jee, Eun-Sook
    • The Pure and Applied Mathematics
    • /
    • v.1 no.1
    • /
    • pp.19-24
    • /
    • 1994
  • Let $P$ be a probability measure on the real line with Lebesgue density $f$. The usual estimator of the distribution function (≡ df) of $P$ for the sample $X_1,\ldots,X_n$ is the empirical df $F_n(t)=\frac{1}{n}\sum_{i=1}^{n}\mathbf{1}_{\{X_i\leq t\}}$. But this estimator does not take into account the smoothness of $F$, that is, the existence of a density $f$. Therefore, one should expect that an estimator which is better adapted to this situation beats the empirical df with respect to a reasonable measure of performance. (omitted)
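
A minimal sketch of the comparison described here: the empirical distribution function versus a kernel-smoothed distribution function built from the integrated Gaussian kernel (the bandwidth rule n^(-1/3) is an assumed choice, not taken from the paper).

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(7)
X = rng.normal(size=200)
n = len(X)
b = n ** (-1 / 3)                      # assumed bandwidth

def F_empirical(t):
    # empirical distribution function
    return np.mean(X <= t)

def F_smooth(t):
    # kernel-type df estimator: average of the integrated Gaussian kernel
    return np.mean(norm.cdf((t - X) / b))

for t in (-1.0, 0.0, 1.0):
    print(t, F_empirical(t), F_smooth(t), norm.cdf(t))
```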


FREQUENCY HISTOGRAM MODEL FOR LINE TRANSECT DATA WITH AND WITHOUT THE SHOULDER CONDITION

  • Eidous, Omar
    • Journal of the Korean Statistical Society
    • /
    • v.34 no.1
    • /
    • pp.49-60
    • /
    • 2005
  • In this paper we introduce a nonparametric method for estimating the probability density function of detection distances in line transect sampling. The estimator is obtained using a frequency histogram density estimation method. The asymptotic properties of the proposed estimator are derived and compared with those of the kernel estimator under the assumption that the data collected satisfy the shoulder condition. We found that the asymptotic mean square errors (AMSE) of the two estimators have about the same convergence rate. The formula for the optimal histogram bin width that minimizes the AMSE is derived. Moreover, the performances of the corresponding k-nearest-neighbor estimators are studied through simulation techniques. When it is not known whether the shoulder condition is valid, a new semi-parametric model is suggested to fit the line transect data. The performances of the two proposed estimators are studied and compared with some existing nonparametric and semiparametric estimators using simulation techniques. The results demonstrate the superiority of the new estimators in most of the cases considered.
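
A minimal sketch of the histogram idea for line transect data, under an assumed half-normal detection model (which satisfies the shoulder condition) and an assumed bin-width rule; it estimates f(0), the quantity that drives abundance estimation in line transect sampling.

```python
import numpy as np

rng = np.random.default_rng(8)
sigma = 1.0
d = np.abs(rng.normal(0.0, sigma, 400))       # perpendicular detection distances
n = len(d)
h = 2.0 * d.std() * n ** (-1 / 3)             # assumed bin-width rule

f0_hist = np.mean(d < h) / h                  # first-bin relative frequency / bin width
f0_true = 2.0 / (sigma * np.sqrt(2 * np.pi))  # half-normal density at zero
print("histogram f(0):", f0_hist, "  true f(0):", f0_true)
```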

A Kernel Density Signal Grouping Based on Radar Frequency Distribution (레이더 주파수 분포 기반 커널 밀도 신호 그룹화 기법)

  • Lee, Dong-Weon;Han, Jin-Woo;Lee, Won-Don
    • Journal of the Institute of Electronics Engineers of Korea SP
    • /
    • v.48 no.6
    • /
    • pp.124-132
    • /
    • 2011
  • In modern electronic warfare, radar signal environments are becoming denser and more complex. Therefore, reliable signal analysis techniques are required for an ES (Electronic warfare Support) system to identify and analyze individual emitter signals from the received signals. In this paper, we propose a new signal grouping algorithm to ensure reliable signal analysis and to reduce the cost of the signal processing steps in the ES. The proposed grouping algorithm uses a KDE (Kernel Density Estimator) and its CDF (Cumulative Distribution Function) to compose windows that reflect the statistical distribution characteristics of the radar frequency modulation type. Simulation results show the good performance of the proposed technique in signal grouping.
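
A loose illustration of KDE-based frequency grouping, using a generic valley-splitting rule on the estimated density rather than the paper's CDF-window construction; the three simulated emitters and their carrier frequencies are assumptions.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(9)
# three simulated emitters with different carrier frequencies (MHz)
freqs = np.concatenate([rng.normal(9400, 5, 300),
                        rng.normal(9550, 8, 200),
                        rng.normal(9720, 4, 150)])

grid = np.linspace(freqs.min() - 30, freqs.max() + 30, 2000)
density = gaussian_kde(freqs)(grid)

# interior local minima of the estimated density act as group boundaries
interior = (density[1:-1] < density[:-2]) & (density[1:-1] < density[2:])
cuts = grid[1:-1][interior]

labels = np.searchsorted(cuts, freqs)         # group index for every pulse
for g in np.unique(labels):
    group = freqs[labels == g]
    print(f"group {g}: {len(group)} pulses around {group.mean():.1f} MHz")
```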