• Title/Summary/Keyword: Regression Estimator

Search Result 311, Processing Time 0.022 seconds

On Rice Estimator in Simple Regression Models with Outliers (이상치가 존재하는 단순회귀모형에서 Rice 추정량에 관해서)

  • Park, Chun Gun
    • The Korean Journal of Applied Statistics
    • /
    • v.26 no.3
    • /
    • pp.511-520
    • /
    • 2013
  • Detecting outliers and robust estimation are crucial in regression models with outliers. Such studies focus on detecting outliers and estimating the coefficients using leave-one-out methods. Our study introduces the Rice estimator, an error variance estimator that does not require estimating the coefficients. In particular, we compare the statistical properties of the Rice estimator with and without outliers in simple regression models.
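
The Rice estimator referred to in this abstract is a first-order difference-based variance estimator; a minimal sketch in Python (NumPy), assuming the data are ordered by the predictor, might look like:

```python
import numpy as np

def rice_variance(x, y):
    """Rice (1984) difference-based estimate of the error variance.

    Sorts the responses by the predictor and averages the squared
    first differences; no regression coefficients are estimated.
    """
    order = np.argsort(x)
    y = np.asarray(y, dtype=float)[order]
    d = np.diff(y)
    return np.sum(d ** 2) / (2.0 * (len(y) - 1))

# Illustration on a simple linear model with true error variance 0.25
rng = np.random.default_rng(0)
x = rng.uniform(0.0, 1.0, 500)
y = 1.0 + 2.0 * x + rng.normal(0.0, 0.5, 500)
print(rice_variance(x, y))  # typically close to 0.25
```

Because consecutive differences cancel the smooth trend, the slope never needs to be estimated, which is what makes the estimator's behavior under outliers interesting to study.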

A note on nonparametric density deconvolution by weighted kernel estimators

  • Lee, Sungho
    • Journal of the Korean Data and Information Science Society
    • /
    • v.25 no.4
    • /
    • pp.951-959
    • /
    • 2014
  • Recently, Hazelton and Turlach (2009) proposed a weighted kernel density estimator for the deconvolution problem. In the case of Gaussian kernels and measurement error, they argued that the weighted kernel density estimator is competitive with the classical deconvolution kernel estimator. In this paper we consider weighted kernel density estimators when the sample observations are contaminated by double exponentially distributed errors. The performance of the weighted kernel density estimators is compared with the classical deconvolution kernel estimator and with the kernel density estimator based on the support vector regression method by means of a simulation study. The weighted density estimator with the Gaussian kernel shows numerical instability in the practical implementation of its optimization function. The weighted density estimates with the double exponential kernel follow patterns very similar to the classical kernel density estimates in the simulations, but their shape is less satisfactory than that of the classical kernel density estimator with the Gaussian kernel.

The Weight Function in BIRQ Estimator for the AR(1) Model with Additive Outliers

  • Jung Byoung Cheol;Han Sang Moon
    • Proceedings of the Korean Statistical Society Conference
    • /
    • 2004.11a
    • /
    • pp.129-134
    • /
    • 2004
  • In this study, we investigate the effects of the weight function in the bounded influence regression quantile (BIRQ) estimator for the AR(1) model with additive outliers. In order to down-weight outliers on the X-axis, Mallows' (1973) weight function has been commonly used in the BIRQ estimator. In our Monte Carlo study, however, the BIRQ estimator using Tukey's bisquare weight function shows smaller MSE and bias than the estimator using Mallows' or Huber's weight function.

  • PDF

Adaptive L-estimation for regression slope under asymmetric error distributions (비대칭 오차모형하에서의 회귀기울기에 대한 적합된 L-추정법)

  • Han, Sang Moon
    • The Korean Journal of Applied Statistics
    • /
    • v.6 no.1
    • /
    • pp.79-93
    • /
    • 1993
  • We consider adaptive L-estimation of the slope parameter in a regression model. The proposed estimator is a simple extension of the trimmed least squares estimator proposed by Ruppert and Carroll. Under asymmetric error distributions, the proposed estimator compares especially well with the usual least squares estimator, the least absolute value estimator, and M-estimators designed for asymmetric distributions.

  • PDF
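
The trimmed least squares idea of Ruppert and Carroll can be sketched roughly as follows. Note that their original proposal trims between preliminary regression quantiles; this simplified illustration instead trims on residual quantiles from a preliminary OLS fit, which is an assumption of the sketch:

```python
import numpy as np

def trimmed_ls(x, y, alpha=0.1):
    """Simplified trimmed least squares for a simple regression.

    Fit a preliminary OLS line, drop observations whose residuals
    fall outside the central (1 - 2*alpha) band, then refit by OLS.
    (Ruppert and Carroll trim between regression quantiles; this
    residual-quantile version is an illustrative stand-in.)
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    X = np.column_stack([np.ones_like(x), x])
    beta0, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta0
    lo, hi = np.quantile(resid, [alpha, 1.0 - alpha])
    keep = (resid >= lo) & (resid <= hi)
    beta, *_ = np.linalg.lstsq(X[keep], y[keep], rcond=None)
    return beta  # (intercept, slope)
```

With asymmetric errors, trimming the two tails unequally (different upper and lower `alpha`) is what the adaptive versions discussed in these papers tune from the data.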

A Study on Bandwidth Selection Based on ASE for Nonparametric Regression Estimator

  • Kim, Tae-Yoon
    • Journal of the Korean Statistical Society
    • /
    • v.30 no.1
    • /
    • pp.21-30
    • /
    • 2001
  • Suppose we observe a set of data (X$_1$, Y$_1$), …, (X$_n$, Y$_n$) and use the Nadaraya-Watson regression estimator to estimate m(x)=E(Y│X=x). In this article the bandwidth selection problem for the Nadaraya-Watson regression estimator is investigated. In particular, a cross-validation method based on the average squared error (ASE) is considered. Theoretical results include a central limit theorem that quantifies the convergence rate of the bandwidth selector.

  • PDF
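
The Nadaraya-Watson estimator and a cross-validation bandwidth choice of the kind studied here can be sketched as follows; the Gaussian kernel and the grid search are assumptions of this illustration, not the paper's exact setup:

```python
import numpy as np

def nw_estimate(x0, x, y, h):
    """Nadaraya-Watson estimate of m(x0) = E(Y | X = x0),
    a kernel-weighted average of the responses."""
    w = np.exp(-0.5 * ((x0 - x) / h) ** 2)  # Gaussian kernel weights
    return np.sum(w * y) / np.sum(w)

def cv_bandwidth(x, y, grid):
    """Leave-one-out cross-validation over a bandwidth grid: pick the
    bandwidth minimizing the average squared prediction error, a
    data-driven proxy for the average squared error (ASE)."""
    n = len(x)
    def cv_score(h):
        errs = []
        for i in range(n):
            mask = np.arange(n) != i
            errs.append((y[i] - nw_estimate(x[i], x[mask], y[mask], h)) ** 2)
        return np.mean(errs)
    return min(grid, key=cv_score)
```

The CV score mimics the ASE because each point is predicted from the remaining data, so the selected bandwidth trades off bias (oversmoothing) against variance (undersmoothing).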

Asymptotically Efficient L-Estimation for Regression Slope When Trimming is Given (절사가 주어질때 회귀기울기의 점근적 최량 L-추정법)

  • Sang Moon Han
    • The Korean Journal of Applied Statistics
    • /
    • v.7 no.2
    • /
    • pp.173-182
    • /
    • 1994
  • By applying the slope estimator for arbitrary error distributions proposed by Han (1993), we show that, if regression quantiles are defined to give the upper and lower trimming proportions and blocks of data, the proposed slope estimator is asymptotically efficient as the number of regression quantiles used to form the blocks of data becomes sufficiently large.

  • PDF

Robustness of Minimum Disparity Estimators in Linear Regression Models

  • Pak, Ro-Jin
    • Journal of the Korean Statistical Society
    • /
    • v.24 no.2
    • /
    • pp.349-360
    • /
    • 1995
  • This paper deals with the robustness properties of minimum disparity estimation in linear regression models. The estimators are defined as the statistical quantities which minimize the blended weight Hellinger distance between a weighted kernel density estimator of the residuals and a smoothed model density of the residuals. It is shown that if the weights of the density estimator are appropriately chosen, the estimates of the regression parameters are robust.

  • PDF

The Weight Function in the Bounded Influence Regression Quantile Estimator for the AR(1) Model with Additive Outliers

  • Jung Byoung Cheol;Han Sang Moon
    • Communications for Statistical Applications and Methods
    • /
    • v.12 no.1
    • /
    • pp.169-179
    • /
    • 2005
  • In this study, we investigate the effects of the weight function in the bounded influence regression quantile (BIRQ) estimator for the AR(1) model with additive outliers. In order to down-weight outliers on the X-axis, Mallows' (1973) weight function has been commonly used in the BIRQ estimator. In our Monte Carlo study, however, the BIRQ estimator using Tukey's bisquare weight function shows smaller MSE and bias than the estimator using Mallows' or Huber's weight function. Thus, the use of Tukey's weight function is recommended in the BIRQ estimator for our model.
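
The weight functions compared in these BIRQ studies can be written down directly. The tuning constants below are conventional defaults, and the median/MAD standardization in the Mallows-type design weight is an assumption of this sketch, not the papers' exact choice:

```python
import numpy as np

def huber_weight(u, c=1.345):
    """Huber weight: 1 inside [-c, c], c/|u| outside (monotone decay)."""
    au = np.abs(u)
    return np.minimum(1.0, c / np.maximum(au, c))

def tukey_bisquare_weight(u, c=4.685):
    """Tukey bisquare weight: smooth and redescending, exactly 0
    beyond |u| = c, so gross outliers get zero influence."""
    au = np.abs(u)
    w = (1.0 - (au / c) ** 2) ** 2
    return np.where(au <= c, w, 0.0)

def mallows_weight(x, c=1.345):
    """A Mallows-type weight on the design points: down-weight
    x-values far from the bulk of the design. Standardizing by the
    median and MAD is an illustrative assumption."""
    med = np.median(x)
    mad = np.median(np.abs(x - med)) / 0.6745
    return huber_weight((x - med) / mad, c)
```

The qualitative difference the Monte Carlo study exploits is visible here: Huber and Mallows weights never vanish, while the bisquare weight drops to zero for extreme additive outliers.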

The Use Ridge Regression for Yield Prediction Models with Multicollinearity Problems (수확예측(收穫豫測) Model의 Multicollinearity 문제점(問題點) 해결(解決)을 위(爲)한 Ridge Regression의 이용(利用))

  • Shin, Man Yong
    • Journal of Korean Society of Forest Science
    • /
    • v.79 no.3
    • /
    • pp.260-268
    • /
    • 1990
  • Two types of ridge regression estimators were compared with the ordinary least squares (OLS) estimator in order to select the "best" estimator when multicollinearity existed. The ridge estimators were based on Mallows's (1973) $C_P$-like statistic and Allen's (1974) PRESS-like statistic. The evaluation was conducted based on the predictive ability of a yield model developed by Matney et al. (1988). A total of 522 plots from the data of the Southwide Loblolly Pine Seed Source study was used. Both ridge estimators were better in predictive ability than the OLS estimator, and the one obtained using Mallows's statistic performed best. Thus, ridge estimators can be recommended as an alternative when multicollinearity exists among the independent variables.

  • PDF
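
The ridge estimator itself is a one-line modification of OLS; a minimal sketch (without the $C_P$- or PRESS-based choice of the ridge constant that the paper evaluates) is:

```python
import numpy as np

def ridge(X, y, k):
    """Ridge estimator: solve (X'X + kI) beta = X'y.

    k = 0 recovers ordinary least squares; k > 0 shrinks the
    coefficients, stabilizing them when columns of X are collinear.
    """
    X = np.asarray(X, dtype=float)
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + k * np.eye(p), X.T @ y)

# Two nearly collinear predictors, as in a multicollinear yield model
rng = np.random.default_rng(1)
x1 = rng.normal(size=100)
x2 = x1 + rng.normal(scale=0.01, size=100)
X = np.column_stack([x1, x2])
y = x1 + x2 + rng.normal(size=100)
print(ridge(X, y, 0.0))  # unstable OLS coefficients
print(ridge(X, y, 1.0))  # shrunken, stabilized coefficients
```

Adding `k` to the diagonal lifts the near-zero eigenvalue of X'X caused by the collinearity, which is why the ridge fits predict better than OLS in the study above.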

A SIMPLE VARIANCE ESTIMATOR IN NONPARAMETRIC REGRESSION MODELS WITH MULTIVARIATE PREDICTORS

  • Lee Young-Kyung;Kim Tae-Yoon;Park Byeong-U.
    • Journal of the Korean Statistical Society
    • /
    • v.35 no.1
    • /
    • pp.105-114
    • /
    • 2006
  • In this paper we propose a simple and computationally attractive difference-based variance estimator in nonparametric regression models with multivariate predictors. We show that the estimator achieves the $n^{-1/2}$ rate of convergence for regression functions with only a first derivative when d, the dimension of the predictor, is less than or equal to 4. When d > 4, the rate turns out to be $n^{-4/(d+4)}$ under the first derivative condition on the regression functions. A numerical study suggests that the proposed estimator has good finite-sample performance.
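
A nearest-neighbour version of the difference-based idea with multivariate predictors can be sketched as follows; this is a simplified illustration of the general approach, not the paper's exact estimator:

```python
import numpy as np

def nn_difference_variance(X, y):
    """Nearest-neighbour difference-based error variance estimate for
    nonparametric regression with multivariate predictors (a
    simplified sketch of the difference-based idea).

    Pairs each observation with its nearest neighbour in predictor
    space and averages half the squared response differences; the
    smooth regression function nearly cancels in each difference.
    """
    X = np.asarray(X, dtype=float)
    y = np.asarray(y, dtype=float)
    # pairwise squared distances; inf on the diagonal so a point
    # is never its own neighbour
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
    np.fill_diagonal(d2, np.inf)
    nn = d2.argmin(axis=1)
    return np.mean((y - y[nn]) ** 2) / 2.0
```

As the abstract indicates, the residual bias from the regression function grows with the predictor dimension d, which is why the attainable rate degrades from $n^{-1/2}$ once d exceeds 4.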