• Title/Abstract/Keywords: local linear estimator


Sparse Design Problem in Local Linear Quasi-likelihood Estimator

  • 박동련
    • 응용통계연구 (The Korean Journal of Applied Statistics) / Vol. 20, No. 1 / pp. 133-145 / 2007
  • The local linear estimator has many desirable properties. However, it has been shown to produce very unstable estimates in regions where the data are sparse, and many approaches to resolving this problem have been studied. For the local linear quasi-likelihood estimator, a modification of the local linear estimator for binary response variables, the sparse design problem had not yet been addressed. In this paper we identify the sparse design problem of the local linear quasi-likelihood estimator, propose several remedies, and select the most effective one through a simulation study.
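The sparse-design instability described above can be made concrete with a small numerical sketch. The following is a minimal local linear quasi-likelihood (local logistic) fit for a binary response, assuming a Gaussian kernel and a Newton-Raphson inner loop; all names and settings here are illustrative assumptions, not the remedy proposed in the paper.

```python
import numpy as np

def local_logistic(x0, X, Y, h, iters=25):
    """Local linear quasi-likelihood (logistic) estimate of P(Y=1 | x0)."""
    w = np.exp(-0.5 * ((X - x0) / h) ** 2)          # Gaussian kernel weights
    Z = np.column_stack([np.ones_like(X), X - x0])  # local linear design
    beta = np.zeros(2)
    for _ in range(iters):                          # Newton-Raphson on the
        p = 1.0 / (1.0 + np.exp(-(Z @ beta)))       # kernel-weighted logistic
        grad = Z.T @ (w * (Y - p))                  # log-likelihood
        H = (Z * (w * p * (1.0 - p))[:, None]).T @ Z
        beta = beta + np.linalg.solve(H + 1e-8 * np.eye(2), grad)
    return 1.0 / (1.0 + np.exp(-beta[0]))

rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, 200)
Y = (rng.uniform(size=200) < 1.0 / (1.0 + np.exp(-(4.0 * X - 2.0)))).astype(float)
p_hat = local_logistic(0.5, X, Y, h=0.15)           # true P(Y=1 | 0.5) = 0.5
```

When few observations fall near x0, the weighted Hessian H is nearly singular and the estimate fluctuates wildly, which is precisely the sparsity problem the paper addresses.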

The local influence of LIU type estimator in linear mixed model

  • Zhang, Lili; Baek, Jangsun
    • Journal of the Korean Data and Information Science Society / Vol. 26, No. 2 / pp. 465-474 / 2015
  • In this paper, we study the local influence analysis of the LIU type estimator in linear mixed models. Using the method proposed by Shi (1997), the local influence of the LIU type estimator is investigated under three disturbance models. Furthermore, we give the generalized Cook's distance to assess the influence, and illustrate the efficiency of the proposed method with an example.

Shifted Nadaraya Watson Estimator

  • Chung, Sung-S.
    • Communications for Statistical Applications and Methods / Vol. 4, No. 3 / pp. 881-890 / 1997
  • The local linear estimator usually has more attractive properties than the Nadaraya-Watson estimator, but it performs poorly where data are sparse. Müller and Song proposed the Shifted Nadaraya-Watson estimator, which handles data sparsity well. Through a simulation study, we show that the Shifted Nadaraya-Watson estimator performs well not only in sparse regions but also in dense regions, and we suggest a boundary treatment for the estimator.

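The contrast between the two estimators can be sketched as follows. This is a generic Gaussian-kernel illustration, not the Shifted Nadaraya-Watson construction itself:

```python
import numpy as np

def nadaraya_watson(x0, X, Y, h):
    """Local constant (Nadaraya-Watson) estimate of E[Y | x0]."""
    w = np.exp(-0.5 * ((X - x0) / h) ** 2)
    return np.sum(w * Y) / np.sum(w)

def local_linear(x0, X, Y, h):
    """Local linear estimate: intercept of a kernel-weighted line at x0."""
    w = np.exp(-0.5 * ((X - x0) / h) ** 2)
    Z = np.column_stack([np.ones_like(X), X - x0])
    WZ = Z * w[:, None]
    beta = np.linalg.solve(Z.T @ WZ, WZ.T @ Y)
    return beta[0]

rng = np.random.default_rng(1)
X = rng.uniform(0.0, 1.0, 300)
Y = np.sin(2.0 * np.pi * X) + 0.1 * rng.normal(size=300)
m_ll = local_linear(0.25, X, Y, h=0.05)    # true m(0.25) = 1
m_nw = nadaraya_watson(0.25, X, Y, h=0.05)
```

At interior points with a dense design both behave similarly; the differences emerge at boundaries and in sparse regions, where the local linear weight matrix can become nearly singular.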

Modified Local Density Estimation for the Log-Linear Density

  • Pak, Ro-Jin
    • Communications for Statistical Applications and Methods / Vol. 7, No. 1 / pp. 13-22 / 2000
  • We consider the local likelihood method with a smoothed version of the model density instead of the original model density. For simplicity, the model is assumed to be a log-linear density. We show that the proposed local density estimator is less affected by changes among the observations, although its bias increases slightly over that of the currently used local density estimator. Hence, by using the existing method and the proposed method together in a proper way, one can derive a local density estimator that fits the data better.


Local Influence of the Quasi-likelihood Estimators in Generalized Linear Models

  • Jung, Kang-Mo
    • Communications for Statistical Applications and Methods / Vol. 14, No. 1 / pp. 229-239 / 2007
  • We present a diagnostic method for the quasi-likelihood estimators in generalized linear models. Since these estimators are usually obtained by iteratively reweighted least squares, which is well known to be very sensitive to unusual data, a diagnostic step is indispensable in data analysis. We extend the local influence approach based on the maximum likelihood function to the quasi-likelihood function, and derive local influence diagnostics under several perturbation schemes. An illustrative example is given, and we compare the results provided by the local influence and case-deletion approaches.
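For comparison with the deletion diagnostics mentioned in the abstract, the classical case-deletion measure, Cook's distance, can be computed from a single OLS fit via the hat matrix. The sketch below is the standard least-squares version, not the generalized Cook's distance derived in the paper for quasi-likelihood estimators.

```python
import numpy as np

def cooks_distance(X, y):
    """Cook's distance for each observation in an OLS fit (intercept included)."""
    n = len(y)
    Z = np.column_stack([np.ones(n), X])
    p = Z.shape[1]
    H = Z @ np.linalg.solve(Z.T @ Z, Z.T)      # hat matrix
    h = np.diag(H)                             # leverages
    r = y - H @ y                              # residuals
    s2 = r @ r / (n - p)                       # residual variance estimate
    return (r**2 / (p * s2)) * h / (1.0 - h) ** 2

rng = np.random.default_rng(2)
x = rng.normal(size=50)
y = 1.0 + 2.0 * x + 0.1 * rng.normal(size=50)
y[0] += 5.0                                    # plant one gross outlier
D = cooks_distance(x, y)                       # D[0] should dominate
```

Local influence replaces this observation-by-observation deletion with the curvature of a perturbed likelihood, which is what the perturbation schemes in the paper formalize.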

A Local Linear Kernel Estimator for Sparse Multinomial Data

  • Baek, Jangsun
    • Journal of the Korean Statistical Society / Vol. 27, No. 4 / pp. 515-529 / 1998
  • Burman (1987) and Hall and Titterington (1987) studied kernel smoothing for sparse multinomial data in detail. Both of their estimators of the cell probabilities are sparse-asymptotically consistent under some restrictive conditions on the true cell probabilities. Dong and Simonoff (1994) adopted boundary kernels to relax these restrictive conditions. We propose a local linear kernel estimator, popular in nonparametric regression, for estimating cell probabilities. No boundary adjustment is necessary, since the estimator adapts automatically to estimation at the boundaries. It is shown that our estimator attains the optimal rate of convergence in mean sum of squared error under sparseness. Simulation results and a real data application illustrate the performance of the estimator.

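The idea of smoothing cell probabilities with a local linear fit can be sketched as follows, treating equally spaced cell midpoints as the design points. The kernel choice and the clip-and-renormalize step are illustrative assumptions, not the exact estimator of the paper.

```python
import numpy as np

def smooth_cell_probs(counts, h):
    """Local linear smoothing of multinomial cell proportions."""
    k = len(counts)
    x = (np.arange(k) + 0.5) / k                 # equally spaced cell midpoints
    raw = counts / counts.sum()                  # raw cell proportions
    est = np.empty(k)
    for i, x0 in enumerate(x):
        w = np.exp(-0.5 * ((x - x0) / h) ** 2)   # Gaussian kernel weights
        Z = np.column_stack([np.ones(k), x - x0])
        WZ = Z * w[:, None]
        est[i] = np.linalg.solve(Z.T @ WZ, WZ.T @ raw)[0]
    est = np.clip(est, 0.0, None)                # clip, then renormalize so the
    return est / est.sum()                       # result is a probability vector

counts = np.array([0, 2, 5, 9, 14, 13, 10, 4, 2, 1], dtype=float)
probs = smooth_cell_probs(counts, h=0.15)
```

Note that the local linear fit needs no special treatment at the first and last cells, which is the boundary-adaptation property the abstract emphasizes.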

ON MARGINAL INTEGRATION METHOD IN NONPARAMETRIC REGRESSION

  • Lee, Young-Kyung
    • Journal of the Korean Statistical Society / Vol. 33, No. 4 / pp. 435-447 / 2004
  • In additive nonparametric regression, Linton and Nielsen (1995) showed that marginal integration, when applied to the local linear smoother, produces a rate-optimal estimator of each univariate component function when the dimension of the predictor is two. In this paper we give new formulas for the bias and variance of the marginal integration regression estimators, valid at boundary areas as well as at fixed interior points, and show that the local linear marginal integration estimator is in fact rate-optimal when the dimension of the predictor is at most four. We extend the results to the local polynomial smoother as well.
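For reference, the marginal integration construction for a bivariate additive model takes the following standard form (the notation here is generic, not taken from the paper):

```latex
% Additive model: m(x_1, x_2) = c + m_1(x_1) + m_2(x_2),
% with the identifiability constraint E[m_1(X_1)] = E[m_2(X_2)] = 0.
% Marginal integration averages a full local linear fit \hat{m}
% over the empirical distribution of the nuisance direction:
\hat{\eta}_1(x_1) = \frac{1}{n} \sum_{i=1}^{n} \hat{m}(x_1, X_{2i}),
\qquad
\hat{m}_1(x_1) = \hat{\eta}_1(x_1)
  - \frac{1}{n} \sum_{j=1}^{n} \hat{\eta}_1(X_{1j}).
```

Averaging out the second argument isolates the first component up to a constant, which the final centering step removes.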

Penalized rank regression estimator with the smoothly clipped absolute deviation function

  • Park, Jong-Tae; Jung, Kang-Mo
    • Communications for Statistical Applications and Methods / Vol. 24, No. 6 / pp. 673-683 / 2017
  • The least absolute shrinkage and selection operator (LASSO) has been a popular regression estimator with simultaneous variable selection. However, LASSO does not have the oracle property, and a robust version is needed in the case of heavy-tailed errors or serious outliers. We propose a robust penalized regression estimator that provides simultaneous variable selection and estimation. It is based on rank regression and a non-convex penalty, the smoothly clipped absolute deviation (SCAD) function, which has the oracle property. The proposed method combines the robustness of rank regression with the oracle property of the SCAD penalty. We develop an efficient algorithm to compute the proposed estimator; it includes a SCAD estimate based on the local linear approximation, and the estimate can be obtained by the least absolute deviation method. The tuning parameter of the penalty function is chosen by the Bayesian information criterion and cross-validation. Numerical simulations show that the proposed estimator is robust and effective for analyzing contaminated data.
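The SCAD penalty referred to above has a simple closed form (Fan and Li's function, with the conventional a = 3.7). The sketch below gives the penalty and its derivative, which supplies the weight in the local linear approximation; it is not the paper's full rank-regression algorithm.

```python
import numpy as np

def scad_penalty(t, lam, a=3.7):
    """SCAD penalty p_lambda(|t|) of Fan and Li (2001), elementwise."""
    t = np.abs(np.asarray(t, dtype=float))
    return np.where(
        t <= lam,
        lam * t,                                          # L1 zone near zero
        np.where(
            t <= a * lam,                                 # quadratic transition
            -(t**2 - 2.0 * a * lam * t + lam**2) / (2.0 * (a - 1.0)),
            (a + 1.0) * lam**2 / 2.0,                     # flat tail: no shrinkage
        ),
    )

def scad_derivative(t, lam, a=3.7):
    """p'_lambda(t) for t >= 0 -- the weight in the local linear approximation
    p(|b|) ~ p(|b0|) + p'(|b0|) (|b| - |b0|)."""
    t = np.abs(np.asarray(t, dtype=float))
    return lam * ((t <= lam)
                  + np.maximum(a * lam - t, 0.0) / ((a - 1.0) * lam) * (t > lam))
```

The flat tail is what distinguishes SCAD from LASSO: large coefficients receive zero marginal penalty, which is the source of the oracle property.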

Multiple Structural Change-Point Estimation in Linear Regression Models

  • Kim, Jae-Hee
    • Communications for Statistical Applications and Methods / Vol. 19, No. 3 / pp. 423-432 / 2012
  • This paper is concerned with the detection of multiple change-points in linear regression models. The proposed procedure relies on local estimation for global change-point estimation. We propose a multiple change-point estimator based on the local least squares estimators of the regression coefficients and a split measure, for the case where the number of change-points is unknown. Its statistical properties are derived, and its performance is assessed by simulations and real data applications.
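A minimal version of the local least-squares idea, reduced to a single change-point, scans candidate split points and minimizes the combined segment error. This generic sketch is not the paper's split measure or its multiple-change-point procedure.

```python
import numpy as np

def one_changepoint(X, Y, min_seg=10):
    """Single change-point by minimizing combined least-squares error."""
    n = len(Y)
    def seg_sse(lo, hi):                       # SSE of a linear fit on [lo, hi)
        Z = np.column_stack([np.ones(hi - lo), X[lo:hi]])
        beta, *_ = np.linalg.lstsq(Z, Y[lo:hi], rcond=None)
        r = Y[lo:hi] - Z @ beta
        return r @ r
    return min(range(min_seg, n - min_seg),
               key=lambda k: seg_sse(0, k) + seg_sse(k, n))

rng = np.random.default_rng(3)
X = np.linspace(0.0, 1.0, 100)
Y = 2.0 * X + (X >= 0.5) * 1.0 + 0.05 * rng.normal(size=100)
cp = one_changepoint(X, Y)                     # true break between indices 49 and 50
```

Multiple change-points require applying such local fits recursively or over a grid of splits, which is where a split measure and the unknown number of change-points come into play.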

On Convex Combination of Local Constant Regression

  • Mun, Jung-Won; Kim, Choong-Rak
    • Communications for Statistical Applications and Methods / Vol. 13, No. 2 / pp. 379-387 / 2006
  • Local polynomial regression is widely used because of good properties such as adaptation to various types of designs, the absence of boundary effects, and minimax efficiency. Choi and Hall (1998) proposed an estimator of the regression function using a convex combination idea. They showed that a convex combination of three local linear estimators produces an estimator with the same order of bias as a local cubic smoother. In this paper we suggest another estimator of the regression function, based on a convex combination of five local constant estimates. It turns out that this estimator also has the same order of bias as a local cubic smoother.