• Title/Summary/Keyword: Regression estimator

An Equivariant and Robust Estimator in Multivariate Regression Based on Least Trimmed Squares

  • Jung, Kang-Mo
    • Communications for Statistical Applications and Methods / v.10 no.3 / pp.1037-1046 / 2003
  • We propose an equivariant and robust estimator for the multivariate regression model based on the least trimmed squares (LTS) estimator in univariate regression, which we call the multivariate least trimmed squares (MLTS) estimator. The MLTS estimator accounts for correlations among the response variables, and it can be shown to have the equivariance properties appropriate to multivariate regression. The MLTS estimator has a high breakdown point, as does the LTS estimator in the univariate case. We develop an algorithm for computing the MLTS estimate. Simulations are performed to compare the efficiency of the MLTS estimate with that of the coordinatewise LTS estimate, and a numerical example illustrates the effectiveness of the MLTS estimate in multivariate regression.
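
The univariate LTS building block behind this proposal can be sketched in a few lines: random elemental starts followed by concentration (C-) steps. The sketch below is a minimal illustration, not the authors' multivariate MLTS algorithm; the coverage `h`, restart counts, and function names are illustrative assumptions.

```python
# Minimal univariate LTS sketch: random elemental starts plus C-steps.
# Coverage h and the restart counts are illustrative choices.
import numpy as np

def lts_fit(X, y, h=None, n_starts=200, n_csteps=20, seed=0):
    """Least trimmed squares: minimize the sum of the h smallest squared residuals."""
    n, p = X.shape
    if h is None:
        h = (n + p + 1) // 2                                # common coverage choice
    rng = np.random.default_rng(seed)
    best_beta, best_obj = None, np.inf
    for _ in range(n_starts):
        idx = rng.choice(n, size=p, replace=False)          # random elemental subset
        beta, *_ = np.linalg.lstsq(X[idx], y[idx], rcond=None)
        for _ in range(n_csteps):                           # C-steps: refit on the h best points
            keep = np.argsort((y - X @ beta) ** 2)[:h]
            beta_new, *_ = np.linalg.lstsq(X[keep], y[keep], rcond=None)
            if np.allclose(beta_new, beta):
                break
            beta = beta_new
        obj = np.sort((y - X @ beta) ** 2)[:h].sum()        # trimmed sum of squares
        if obj < best_obj:
            best_obj, best_beta = obj, beta
    return best_beta
```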

Nonparametric Estimation in Regression Model

  • Han, Sang Moon
    • Communications for Statistical Applications and Methods / v.8 no.1 / pp.15-27 / 2001
  • One proposal is made for constructing a nonparametric estimator of the slope parameters in a regression model under symmetric error distributions. The estimator combines Johns' idea for estimating the center of a symmetric distribution with the ideas of regression quantiles and the regression trimmed mean. This nonparametric estimator and some other L-estimators are studied by Monte Carlo simulation.
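
As a rough illustration of the regression trimmed mean idea, the sketch below trims observations lying outside the alpha and 1-alpha regression quantile planes and refits least squares on the remaining points. This is one standard construction, not necessarily the paper's exact estimator; the trimming proportion `alpha` and the simple-regression setup are illustrative assumptions.

```python
# Regression trimmed mean sketch: trim by regression quantile planes, refit OLS.
import numpy as np
import statsmodels.api as sm
from statsmodels.regression.quantile_regression import QuantReg

def regression_trimmed_mean(x, y, alpha=0.1):
    """Drop points outside the alpha and 1-alpha regression quantile planes, then refit OLS."""
    X = sm.add_constant(x)                               # design matrix [1, x]
    lo = X @ QuantReg(y, X).fit(q=alpha).params          # lower regression quantile plane
    hi = X @ QuantReg(y, X).fit(q=1 - alpha).params      # upper regression quantile plane
    keep = (y >= lo) & (y <= hi)                         # central observations only
    beta, *_ = np.linalg.lstsq(X[keep], y[keep], rcond=None)
    return beta                                          # [intercept, slope]
```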

Nonparametric Estimation using Regression Quantiles in a Regression Model

  • Han, Sang-Moon;Jung, Byoung-Cheol
    • The Korean Journal of Applied Statistics / v.25 no.5 / pp.793-802 / 2012
  • One proposal is made for constructing a nonparametric estimator of the slope parameters in a regression model under symmetric error distributions. The estimator is based on minimizing the approximate variance of a proposed estimator built from regression quantiles. This nonparametric estimator and some other L-estimators are studied and compared with well-known M-estimators through a simulation study.
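
One way to read "minimizing the approximate variance of an estimator built from regression quantiles" is to combine slope estimates from several regression quantiles with weights chosen to minimize the variance of the weighted average. The sketch below does this using a bootstrap covariance estimate; the quantile grid, bootstrap scheme, and weight formula are illustrative assumptions, not the paper's construction.

```python
# Sketch: minimum-variance weighted combination of regression-quantile slopes.
import numpy as np
import statsmodels.api as sm
from statsmodels.regression.quantile_regression import QuantReg

def quantile_slopes(x, y, taus):
    X = sm.add_constant(x)                        # design matrix [1, x]
    return np.array([QuantReg(y, X).fit(q=t).params[1] for t in taus])

def min_variance_combination(x, y, taus=(0.25, 0.5, 0.75), n_boot=200, seed=0):
    rng = np.random.default_rng(seed)
    n = len(y)
    boot = np.empty((n_boot, len(taus)))
    for b in range(n_boot):                       # bootstrap the vector of slope estimates
        idx = rng.choice(n, size=n, replace=True)
        boot[b] = quantile_slopes(x[idx], y[idx], taus)
    cov = np.cov(boot, rowvar=False)              # approximate covariance of the slopes
    w = np.linalg.solve(cov, np.ones(len(taus)))  # minimum-variance weights: Sigma^{-1} 1
    w /= w.sum()                                  # normalized to sum to one
    return w @ quantile_slopes(x, y, taus)
```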

A Robust Estimator in Multivariate Regression Using Least Quartile Difference

  • Jung, Kang-Mo
    • Communications for Statistical Applications and Methods / v.12 no.1 / pp.39-46 / 2005
  • We propose an equivariant and robust estimator for the multivariate regression model based on the least quartile difference (LQD) estimator in univariate regression, which we call the multivariate least quartile difference (MLQD) estimator. The MLQD estimator accounts for correlations among the response variables, and it can be shown to have the equivariance properties appropriate to multivariate regression. The MLQD estimator has a high breakdown point, as does the univariate LQD estimator. We develop an algorithm for computing the MLQD estimate. Simulations are performed to compare the efficiency of the MLQD estimate with that of the coordinatewise LQD estimate and the multivariate least trimmed squares estimate.
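
The univariate LQD objective, the "quartile" of the pairwise absolute residual differences, is easy to state in code. The sketch below evaluates it over random elemental fits as a naive stand-in for the authors' algorithm; the restart count and function names are illustrative assumptions.

```python
# Minimal univariate LQD sketch: random elemental fits scored by the quartile
# of pairwise absolute residual differences.
import numpy as np
from math import comb

def lqd_objective(r, hp):
    """k-th smallest pairwise |r_i - r_j|, i < j, with k = C(hp, 2)."""
    diffs = np.abs(r[:, None] - r[None, :])[np.triu_indices(len(r), k=1)]
    k = comb(hp, 2)
    return np.partition(diffs, k - 1)[k - 1]

def lqd_fit(X, y, n_starts=500, seed=0):
    n, p = X.shape
    hp = (n + p + 1) // 2                                  # common coverage choice
    rng = np.random.default_rng(seed)
    best_beta, best_obj = None, np.inf
    for _ in range(n_starts):
        idx = rng.choice(n, size=p, replace=False)         # random elemental subset
        beta, *_ = np.linalg.lstsq(X[idx], y[idx], rcond=None)
        obj = lqd_objective(y - X @ beta, hp)
        if obj < best_obj:
            best_obj, best_beta = obj, beta
    return best_beta
```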

Variance Estimation of Error in the Regression Model at a Point

  • Oh, Jong-Chul
    • Journal of Applied Mathematics & Informatics / v.13 no.1_2 / pp.501-508 / 2003
  • Although estimation of the regression function is important, some work has focused on estimating the variance of the error term in the regression model. Different variance estimators perform well under different conditions, and in many practical situations it is hard to assess which conditions are approximately satisfied, and hence which variance estimator is best for the given data. In this article we suggest the SHM estimator and compare it with the LS estimator, the estimator commonly used in parametric multiple regression analysis. Moreover, a combined variance estimator, VEM, is suggested. A simulation study shows that VEM performs well in practice.
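
For reference, the LS variance estimator used as the baseline above is simply the residual sum of squares from an ordinary least squares fit divided by the residual degrees of freedom; a minimal version is sketched below. The paper's SHM and VEM estimators are not reproduced here.

```python
# LS error-variance estimator: RSS / (n - p) from an OLS fit.
import numpy as np

def ls_error_variance(X, y):
    """sigma^2_hat = RSS / (n - p)."""
    n, p = X.shape
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = np.sum((y - X @ beta) ** 2)
    return rss / (n - p)
```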

A Study on Kernel Type Discontinuity Point Estimations

  • Huh, Jib
    • Journal of the Korean Data and Information Science Society / v.14 no.4 / pp.929-937 / 2003
  • Kernel-type estimators of a discontinuity point at an unknown location in a regression function or its derivatives have been developed. It is known that the discontinuity point estimator based on the Gasser-Müller regression estimator with a one-sided kernel that vanishes at the point 0 exhibits poor asymptotic behavior. Furthermore, the asymptotic variance of the Gasser-Müller regression estimator in the random design case is 1.5 times larger than in the corresponding fixed design case, whereas the two are identical for the local polynomial regression estimator. Even when the Gasser-Müller regression estimator is modified to use a one-sided kernel that is non-zero at the point 0, computer simulations show that the same phenomenon appears in discontinuity point estimation.
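
A minimal version of the one-sided kernel idea behind such estimators: estimate the jump location as the point where left- and right-sided kernel regression estimates differ the most. The sketch below uses one-sided Nadaraya-Watson weights with an Epanechnikov-type kernel (non-zero at 0) as a stand-in for Gasser-Müller weights, and the bandwidth and grid are illustrative choices.

```python
# Sketch of kernel-type jump point estimation via left/right one-sided fits.
import numpy as np

def one_sided_nw(x0, x, y, h, side):
    """One-sided kernel regression estimate at x0 using data on one side only."""
    d = (x - x0) / h
    mask = (d >= 0) if side == "right" else (d <= 0)
    w = np.where(mask, np.maximum(1.0 - d ** 2, 0.0), 0.0)   # one-sided kernel weights
    return np.sum(w * y) / np.sum(w) if np.sum(w) > 0 else np.nan

def jump_point_estimate(x, y, h, n_grid=200):
    grid = np.linspace(x.min() + h, x.max() - h, n_grid)
    gaps = [abs(one_sided_nw(t, x, y, h, "right") - one_sided_nw(t, x, y, h, "left"))
            for t in grid]
    return grid[int(np.nanargmax(gaps))]      # largest left/right gap = estimated jump point
```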

Penalized rank regression estimator with the smoothly clipped absolute deviation function

  • Park, Jong-Tae;Jung, Kang-Mo
    • Communications for Statistical Applications and Methods / v.24 no.6 / pp.673-683 / 2017
  • The least absolute shrinkage and selection operator (LASSO) has been a popular regression estimator with simultaneous variable selection. However, LASSO does not have the oracle property, and a robust version is needed in the case of heavy-tailed errors or serious outliers. We propose a robust penalized regression estimator that performs simultaneous variable selection and estimation. It is based on rank regression and a non-convex penalty, the smoothly clipped absolute deviation (SCAD) function, which has the oracle property. The proposed method combines the robustness of rank regression with the oracle property of the SCAD penalty. We develop an efficient algorithm to compute the proposed estimator, which includes a SCAD estimate based on the local linear approximation and the tuning parameter of the penalty function; the estimate can be obtained by the least absolute deviation method. We use an optimal tuning parameter based on the Bayesian information criterion and cross-validation. Numerical simulations show that the proposed estimator is robust and effective for analyzing contaminated data.
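
The SCAD penalty and the local linear approximation (LLA) weights mentioned above can be written down directly; a sketch follows, with a = 3.7 as commonly recommended by Fan and Li. Only the penalty pieces are shown; the rank-regression / weighted-LAD solver they would plug into is not reproduced here.

```python
# SCAD penalty and its derivative (the LLA weights).
import numpy as np

def scad_penalty(beta, lam, a=3.7):
    """Elementwise SCAD penalty p_lambda(|beta_j|)."""
    t = np.abs(beta)
    p1 = lam * t                                                  # |t| <= lam
    p2 = (2 * a * lam * t - t ** 2 - lam ** 2) / (2 * (a - 1))    # lam < |t| <= a*lam
    p3 = lam ** 2 * (a + 1) / 2                                   # |t| > a*lam
    return np.where(t <= lam, p1, np.where(t <= a * lam, p2, p3))

def scad_derivative(beta, lam, a=3.7):
    """p'_lambda(|beta_j|); used as weights on |beta_j| in one LLA iteration."""
    t = np.abs(beta)
    return np.where(t <= lam, lam, np.maximum(a * lam - t, 0.0) / (a - 1))

# One LLA step replaces the SCAD penalty by the weighted L1 penalty
#   sum_j scad_derivative(beta_current[j], lam) * |beta[j]|,
# which a weighted LAD (or rank-regression) solver can then handle.
```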

Self-tuning Robust Regression Estimation

  • Park, You-Sung;Lee, Dong-Hee
    • Proceedings of the Korean Statistical Society Conference / 2003.10a / pp.257-262 / 2003
  • We introduce a new robust regression estimator, the self-tuning regression estimator. Various robust estimators have been developed, in both theory and applications, since Huber introduced the M-estimator in the 1960s. We first review various robust estimators and their properties, including their advantages and disadvantages, and then show how the new estimator overcomes drawbacks of other robust regression estimators, such as computational inefficiency in preserving robustness properties.
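
For context, the Huber M-estimator referenced above is typically computed by iteratively reweighted least squares (IRLS); a minimal numpy version is sketched below. The tuning constant c = 1.345 and the MAD scale estimate are conventional choices, not the paper's self-tuning scheme.

```python
# Minimal Huber M-estimator via IRLS.
import numpy as np

def huber_m_estimate(X, y, c=1.345, n_iter=50, tol=1e-8):
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)               # OLS starting value
    for _ in range(n_iter):
        r = y - X @ beta
        scale = 1.4826 * np.median(np.abs(r - np.median(r)))   # MAD scale estimate
        u = np.abs(r) / max(scale, 1e-12)                      # scaled absolute residuals
        w = np.minimum(1.0, c / np.maximum(u, 1e-12))          # Huber weights
        sw = np.sqrt(w)
        beta_new, *_ = np.linalg.lstsq(sw[:, None] * X, sw * y, rcond=None)
        if np.max(np.abs(beta_new - beta)) < tol:
            return beta_new
        beta = beta_new
    return beta
```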

Pitman Nearness for a Generalized Stein-Rule Estimators of Regression Coefficients

  • R. Karan Singh;N. Rastogi
    • Journal of the Korean Statistical Society / v.31 no.2 / pp.229-235 / 2002
  • A generalized Stein-rule estimator of the vector of regression coefficients in the linear regression model is considered and its properties are analyzed according to the criterion of Pitman nearness. A comparative study shows that the generalized Stein-rule estimator, representing a class of estimators, contains particular members that are better than the usual Stein-rule estimator according to Pitman closeness.
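
For orientation, the usual Stein-rule estimator that the generalized class extends shrinks the OLS coefficient vector toward the origin by a data-dependent factor. The sketch below uses one textbook choice of the shrinkage constant (requiring more than two regressors); it is an illustration, not the paper's generalized estimator.

```python
# Sketch of the usual Stein-rule shrinkage of the OLS coefficient vector.
import numpy as np

def stein_rule(X, y, k=None):
    n, p = X.shape
    b, *_ = np.linalg.lstsq(X, y, rcond=None)       # OLS estimator
    rss = np.sum((y - X @ b) ** 2)                  # residual sum of squares
    if k is None:
        k = (p - 2) / (n - p + 2)                   # illustrative shrinkage constant, p > 2
    shrink = 1.0 - k * rss / (b @ (X.T @ X) @ b)    # Stein-rule shrinkage factor
    return shrink * b
```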

First Order Difference-Based Error Variance Estimator in Nonparametric Regression with a Single Outlier

  • Park, Chun-Gun
    • Communications for Statistical Applications and Methods / v.19 no.3 / pp.333-344 / 2012
  • We consider some statistical properties of the first order difference-based error variance estimator in nonparametric regression models with a single outlier. So far, such difference-based estimators have rarely been discussed in the presence of outliers. We propose a first order difference-based estimator that uses the leave-one-out method to detect a single outlier, and we simulate outlier detection in a nonparametric regression model with a single outlier; the outlier detection works well. The results are promising even in nonparametric regression models with many outliers when some difference-based estimators are used.
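
The first order difference-based variance estimator itself is a one-liner, and a leave-one-out screen can be sketched around it as below. The rule for flagging the outlier (the deletion that shrinks the estimate the most) is an illustrative assumption, not necessarily the paper's criterion.

```python
# First order difference-based variance estimator plus a leave-one-out screen.
import numpy as np

def first_diff_variance(y_sorted):
    """sigma^2 estimate: sum of squared first differences / (2 (n - 1)), y ordered by x."""
    d = np.diff(y_sorted)
    return np.sum(d ** 2) / (2 * (len(y_sorted) - 1))

def leave_one_out_outlier(x, y):
    order = np.argsort(x)
    ys = y[order]
    full = first_diff_variance(ys)
    loo = np.array([first_diff_variance(np.delete(ys, i)) for i in range(len(ys))])
    i_star = int(np.argmin(loo))          # removing the outlier shrinks the estimate the most
    return order[i_star], full, loo[i_star]
```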