• Title/Summary/Keyword: LAD estimator


Comparison of Bootstrap Methods for LAD Estimator in AR(1) Model

  • Kang, Kee-Hoon; Shin, Key-Il
    • Communications for Statistical Applications and Methods, v.13 no.3, pp.745-754, 2006
  • It has been shown that LAD estimates are more efficient than LS estimates in the AR(1) model when the error distribution is double exponential. Bootstrap approaches can be used to explore the performance of LAD estimates. In this paper we consider the efficiency of bootstrap methods when LAD estimates are applied to highly variable data. Monte Carlo simulation results are given comparing the generalized bootstrap, stationary bootstrap and threshold bootstrap methods.
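
As a rough illustration of the kind of comparison described in this abstract, the minimal sketch below contrasts the Monte Carlo MSE of LS and LAD estimates of the AR(1) coefficient under double-exponential (Laplace) errors. The function names and the bounded scalar minimization are illustrative assumptions, not taken from the paper.

```python
# Hedged sketch: Monte Carlo comparison of LS vs. LAD for the AR(1)
# coefficient when the errors are double exponential (Laplace).
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(0)

def simulate_ar1(beta, n, rng):
    """Generate X_t = beta*X_{t-1} + e_t with Laplace errors."""
    e = rng.laplace(scale=1.0, size=n)
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = beta * x[t - 1] + e[t]
    return x

def ls_estimate(x):
    """Least squares estimate: regress X_t on X_{t-1} (closed form)."""
    y, z = x[1:], x[:-1]
    return np.dot(z, y) / np.dot(z, z)

def lad_estimate(x):
    """LAD estimate: minimize sum |X_t - b*X_{t-1}| over b."""
    y, z = x[1:], x[:-1]
    obj = lambda b: np.sum(np.abs(y - b * z))
    return minimize_scalar(obj, bounds=(-0.999, 0.999), method="bounded").x

beta, n, reps = 0.5, 200, 1000
ls_sqerr, lad_sqerr = [], []
for _ in range(reps):
    x = simulate_ar1(beta, n, rng)
    ls_sqerr.append((ls_estimate(x) - beta) ** 2)
    lad_sqerr.append((lad_estimate(x) - beta) ** 2)

print("MSE(LS) :", np.mean(ls_sqerr))
print("MSE(LAD):", np.mean(lad_sqerr))
```

Under Laplace errors the LAD column should typically show the smaller MSE, in line with the efficiency claim in the abstract.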

Trimmed LAD Estimators for Multidimensional Contingency Tables (분할표 분석을 위한 절사 LAD 추정량과 최적 절사율 결정)

  • Choi, Hyun-Jip
    • The Korean Journal of Applied Statistics, v.23 no.6, pp.1235-1243, 2010
  • This study proposes trimmed LAD (least absolute deviation) estimators for multidimensional contingency tables and suggests an algorithm to compute them. In addition, a method to determine the trimming quantity of the estimators is suggested. A Monte Carlo study shows that the proposed method yields a better trimming rate and coverage rate than the previously suggested method based on the determinant of the covariance matrix.
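
The abstract does not spell out the estimator; as a hedged sketch, a trimmed LAD criterion for a table with cells $i = 1, \dots, k$, observed counts $n_i$ and model-based expected counts $m_i(\theta)$ could minimize only the $h$ smallest absolute deviations, trimming the remaining $k - h$ cells:

$$\hat{\theta}_h = \arg\min_{\theta} \sum_{i=1}^{h} d_{(i)}(\theta), \qquad d_{(1)}(\theta) \le \cdots \le d_{(k)}(\theta), \quad d_i(\theta) = |n_i - m_i(\theta)|.$$

Choosing $h$ (equivalently, the trimming rate) is then the tuning problem the paper addresses.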

Bootstrap of LAD Estimate in Infinite Variance AR(1) Processes

  • Kang, Hee-Jeong
    • Journal of the Korean Statistical Society, v.26 no.3, pp.383-395, 1997
  • This paper proves that the standard bootstrap approximation for the least absolute deviation (LAD) estimate of $\beta$ in AR(1) processes with infinite-variance error terms is asymptotically valid in probability when the bootstrap resample size is much smaller than the original sample size. The theoretical validity results are supported by simulation studies.
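
A minimal sketch of the resampling scheme this result justifies, assuming a residual-based bootstrap with resample size m much smaller than n. The helper names and the Cauchy error choice are illustrative assumptions, not taken from the paper.

```python
# Hedged sketch: m-out-of-n residual bootstrap for the LAD estimate
# in an AR(1) model with heavy-tailed (infinite-variance) errors.
import numpy as np
from scipy.optimize import minimize_scalar

def lad_ar1(x):
    """LAD estimate of beta in X_t = beta*X_{t-1} + e_t."""
    y, z = x[1:], x[:-1]
    obj = lambda b: np.sum(np.abs(y - b * z))
    return minimize_scalar(obj, bounds=(-0.999, 0.999), method="bounded").x

def m_out_of_n_bootstrap(x, m, n_boot, rng):
    """Residual bootstrap of the LAD estimator with resample size m << len(x)."""
    beta_hat = lad_ar1(x)
    resid = x[1:] - beta_hat * x[:-1]
    resid = resid - np.median(resid)          # center residuals at their median
    reps = []
    for _ in range(n_boot):
        e_star = rng.choice(resid, size=m, replace=True)
        x_star = np.zeros(m)
        for t in range(1, m):
            x_star[t] = beta_hat * x_star[t - 1] + e_star[t]
        reps.append(lad_ar1(x_star))
    return beta_hat, np.array(reps)

rng = np.random.default_rng(1)
n = 2000
e = rng.standard_cauchy(n)                    # heavy-tailed errors as an example
x = np.zeros(n)
for t in range(1, n):
    x[t] = 0.5 * x[t - 1] + e[t]

beta_hat, boot = m_out_of_n_bootstrap(x, m=200, n_boot=500, rng=rng)
print("LAD estimate:", beta_hat, "bootstrap SD:", boot.std())
```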


L-Estimation for the Parameter of the AR(1) Model (AR(1) 모형의 모수에 대한 L-추정법)

  • Han Sang Moon; Jung Byoung Cheal
    • The Korean Journal of Applied Statistics, v.18 no.1, pp.43-56, 2005
  • In this study, a robust estimation method for the first-order autocorrelation coefficient of a time series following an AR(1) process with additive outliers (AO) is investigated. We propose an L-type trimmed least squares estimation method using the preliminary estimator (PE) suggested by Ruppert and Carroll (1980) for the multiple regression model. In addition, by using Mallows' weight function to downweight outliers in the explanatory variable (X-axis), the bounded-influence PE (BIPE) estimator is obtained, and the mean squared error (MSE) performance of various estimators of the autocorrelation coefficient is compared using Monte Carlo experiments. The results of the Monte Carlo study show that the BIPE(LAD) estimator, which uses the generalized LAD as the preliminary estimator, performs well relative to the other estimators.
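
A minimal sketch of the general idea, assuming a plain LAD fit as the preliminary estimator and a fixed trimming fraction; the paper's PE/BIPE constructions and Mallows weighting are more involved than this.

```python
# Hedged sketch: L-type trimmed least squares for the AR(1) coefficient.
# A preliminary fit (here: LAD) flags large residuals; those observations
# are trimmed and ordinary LS is refit on the remaining pairs.
import numpy as np
from scipy.optimize import minimize_scalar

def trimmed_ls_ar1(x, trim=0.1):
    y, z = x[1:], x[:-1]
    # preliminary estimator: LAD fit of X_t on X_{t-1} (illustrative choice)
    pe = minimize_scalar(lambda b: np.sum(np.abs(y - b * z)),
                         bounds=(-0.999, 0.999), method="bounded").x
    abs_resid = np.abs(y - pe * z)
    keep = abs_resid <= np.quantile(abs_resid, 1.0 - trim)  # drop largest residuals
    return np.dot(z[keep], y[keep]) / np.dot(z[keep], z[keep])
```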

Strong Representations for LAD Estimators in AR(1) Models

  • Kang, Hee-Jeong; Shin, Key-Il
    • Journal of the Korean Statistical Society, v.27 no.3, pp.349-358, 1998
  • Consider the AR(1) model $X_t = \beta X_{t-1} + \varepsilon_t$, where $|\beta| < 1$ is an unknown parameter to be estimated and $\{\varepsilon_t\}$ denotes independent and identically distributed error terms with unknown common distribution function F. In this paper, a strong representation for the least absolute deviation (LAD) estimate of $\beta$ in AR(1) models is obtained under some mild conditions on F.
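
For reference, the LAD estimate discussed here is the minimizer of the sum of absolute one-step prediction errors:

$$\hat{\beta}_{\mathrm{LAD}} = \arg\min_{\beta} \sum_{t=2}^{n} \left| X_t - \beta X_{t-1} \right|.$$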


Weighted Least Absolute Deviation Lasso Estimator

  • Jung, Kang-Mo
    • Communications for Statistical Applications and Methods, v.18 no.6, pp.733-739, 2011
  • The least absolute shrinkage and selection operator (Lasso) method improves on the low prediction accuracy and poor interpretability of the ordinary least squares (OLS) estimate through the use of $L_1$ regularization on the regression coefficients. However, the Lasso is not robust to outliers, because it minimizes the sum of squared residuals. Even though the least absolute deviation (LAD) estimator is an alternative to the OLS estimate, it is sensitive to leverage points. We propose a robust Lasso estimator that is not sensitive to outliers, heavy-tailed errors or leverage points.
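
The abstract does not give the exact criterion; a common form for a weighted LAD-Lasso of this kind combines observation weights $w_i$ (chosen to downweight leverage points) with an $L_1$ penalty:

$$\hat{\beta} = \arg\min_{\beta} \sum_{i=1}^{n} w_i \left| y_i - x_i^{\top} \beta \right| + \lambda \sum_{j=1}^{p} |\beta_j|.$$

The absolute-error loss guards against outliers and heavy-tailed errors, the weights against leverage points, and $\lambda$ controls the amount of shrinkage.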

A Comparison of Robust Parameter Estimations for Autoregressive Models (자기회귀모형에서의 로버스트한 모수 추정방법들에 관한 연구)

  • Kang, Hee-Jeong; Kim, Soon-Young
    • Journal of the Korean Data and Information Science Society, v.11 no.1, pp.1-18, 2000
  • In this paper, we study several parameter estimation methods for autoregressive processes and compare them from a forecasting point of view. Least squares estimation, least absolute deviation estimation and robust estimation are compared through Monte Carlo simulations.


Test of Hypotheses based on LAD Estimators in Nonlinear Regression Models

  • Seung Hoe Choi
    • Communications for Statistical Applications and Methods, v.2 no.2, pp.288-295, 1995
  • In this paper a hypothesis testing procedure based on the least absolute deviation estimators of the unknown parameters in nonlinear regression models is investigated. The asymptotic distribution of the proposed likelihood ratio test statistic is established both under the null hypothesis and under a sequence of local alternative hypotheses. The asymptotic relative efficiency of the proposed test with respect to the classical test based on the least squares estimator is also discussed.
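
One common construction of such a test (a hedged sketch, not necessarily the paper's exact statistic) compares the minimized LAD criteria under the restricted and unrestricted models,

$$\Lambda_n = \sum_{i=1}^{n} \bigl| y_i - f(x_i, \tilde{\theta}_n) \bigr| - \sum_{i=1}^{n} \bigl| y_i - f(x_i, \hat{\theta}_n) \bigr|,$$

where $\hat{\theta}_n$ and $\tilde{\theta}_n$ are the unrestricted and null-restricted LAD estimates; suitably normalized by a factor involving the error density at its median, $\Lambda_n$ has an asymptotic chi-squared null distribution.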


On Asymptotic Properties of Bootstrap for Autoregressive Processes with Regularly Varying Tail Probabilities

  • Kang, Hee-Jeong
    • Journal of the Korean Statistical Society, v.26 no.1, pp.31-46, 1997
  • Let $X_t = \beta X_{t-1} + \varepsilon_t$ be an autoregressive process where $|\beta| < 1$ and $\{\varepsilon_t\}$ is independent and identically distributed with regularly varying tail probabilities. This process is called an asymptotically stationary first-order autoregressive (AR(1)) process with infinite variance. In this paper, we obtain a host of weak convergence results for point processes based on bootstrapping of $\{X_t\}$. These kinds of results can be generalized under the infinite-variance assumption to ensure the asymptotic validity of the bootstrap method for various functionals of $\{X_t\}$ such as partial sums, sample covariances and sample correlation functions.
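
For context, regularly varying tail probabilities here mean that for some index $\alpha > 0$ and slowly varying function $L$,

$$P(|\varepsilon_t| > x) = x^{-\alpha} L(x), \qquad x > 0,$$

and the infinite-variance case treated in this paper corresponds to $\alpha < 2$.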
