• Title/Abstract/Keywords: Local polynomial regression

25 search results

Selection of Data-adaptive Polynomial Order in Local Polynomial Nonparametric Regression

  • Jo, Jae-Keun
    • Communications for Statistical Applications and Methods
    • /
    • Vol. 4, No. 1
    • /
    • pp.177-183
    • /
    • 1997
  • A data-adaptive order selection procedure is proposed for local polynomial nonparametric regression. For each candidate polynomial order, bias and variance are estimated, and the order with the smallest estimated mean squared error is selected locally at each location point. To estimate the mean squared error, the empirical bias estimate of Ruppert (1995) and the local polynomial variance estimate of Ruppert, Wand, Holst and Hössjer (1995) are used. Since the proposed method does not require fitting a polynomial model of order higher than the candidate order, it is simpler than the order selection method proposed by Fan and Gijbels (1995b).

  • PDF
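Most of the entries below build on the same weighted-least-squares machinery. As a minimal sketch (Gaussian kernel, naive normal-equations solve; the hypothetical helper `local_poly_fit` is illustrative, not the paper's order-selection procedure):

```python
import numpy as np

def local_poly_fit(x0, x, y, h, p):
    """Local polynomial estimate of E[Y | X = x0] of order p.

    Weighted least squares on powers of (x - x0) with a Gaussian kernel;
    the fitted intercept is the regression estimate at x0.
    """
    w = np.exp(-0.5 * ((x - x0) / h) ** 2)           # kernel weights
    X = np.vander(x - x0, N=p + 1, increasing=True)  # 1, (x-x0), ..., (x-x0)^p
    beta = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * y))
    return beta[0]                                   # intercept = fit at x0

# sanity check: on noiseless linear data a local linear fit (p = 1) is exact
x = np.linspace(0.0, 1.0, 50)
y = 2.0 * x + 1.0
fit = local_poly_fit(0.5, x, y, h=0.2, p=1)
```

The order-selection procedure of the paper would compare such fits across candidate values of p via locally estimated mean squared error.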

On Convex Combination of Local Constant Regression

  • Mun, Jung-Won;Kim, Choong-Rak
    • Communications for Statistical Applications and Methods
    • /
    • Vol. 13, No. 2
    • /
    • pp.379-387
    • /
    • 2006
  • Local polynomial regression is widely used because of good properties such as adaptation to various types of designs, absence of boundary effects, and minimax efficiency. Choi and Hall (1998) proposed an estimator of the regression function using a convex combination idea. They showed that a convex combination of three local linear estimators produces an estimator with the same order of bias as a local cubic smoother. In this paper we suggest another estimator of the regression function, based on a convex combination of five local constant estimates. It turns out that this estimator also has the same order of bias as a local cubic smoother.
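A toy illustration of why combining local constant fits can cancel the leading bias term. Note the weights below (4/3, -1/3) are not convex; Choi and Hall, and the paper above, obtain genuinely convex weights by combining fits at shifted centers. This sketch only shows the bias-cancellation mechanism:

```python
import numpy as np

def nw(x0, x, y, h):
    """Local constant (Nadaraya-Watson) estimate at x0, Gaussian kernel."""
    w = np.exp(-0.5 * ((x - x0) / h) ** 2)
    return np.sum(w * y) / np.sum(w)

# noiseless cubic on a fine grid; the true value at x0 = 0.5 is 0.125
x = np.linspace(-1.0, 2.0, 601)
y = x ** 3
x0 = 0.5

m_big = nw(x0, x, y, 0.30)                 # leading bias is O(h^2)
m_small = nw(x0, x, y, 0.15)               # halving h quarters that bias
combined = (4.0 * m_small - m_big) / 3.0   # weights (4/3, -1/3) cancel the h^2 term
```

The combined estimate lands far closer to the truth than either single-bandwidth fit, which is the same effect the convex-combination constructions achieve while keeping all weights nonnegative.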

ON MARGINAL INTEGRATION METHOD IN NONPARAMETRIC REGRESSION

  • Lee, Young-Kyung
    • Journal of the Korean Statistical Society
    • /
    • Vol. 33, No. 4
    • /
    • pp.435-447
    • /
    • 2004
  • In additive nonparametric regression, Linton and Nielsen (1995) showed that marginal integration applied to the local linear smoother produces a rate-optimal estimator of each univariate component function when the dimension of the predictor is two. In this paper we give new formulas for the bias and variance of the marginal integration regression estimators, valid in boundary areas as well as at fixed interior points, and show that the local linear marginal integration estimator is in fact rate-optimal when the dimension of the predictor is at most four. We extend the results to the local polynomial smoother as well.
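A minimal sketch of the marginal integration idea for two predictors: fit a bivariate smoother, then average it over the observed values of the other coordinate. The paper analyzes the local linear smoother; for brevity this sketch substitutes a local constant fit, and the bandwidth and design are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 400
x1 = rng.uniform(-1, 1, n)
x2 = rng.uniform(-1, 1, n)
y = np.sin(np.pi * x1) + x2 ** 2          # additive truth m1(x1) + m2(x2), no noise

def nw2(t1, t2, h):
    """Bivariate local constant fit at (t1, t2), product Gaussian kernel."""
    w = np.exp(-0.5 * (((x1 - t1) / h) ** 2 + ((x2 - t2) / h) ** 2))
    return np.sum(w * y) / np.sum(w)

def marg_int(t1, h=0.2):
    """Marginal integration: average the bivariate fit over the observed x2."""
    return np.mean([nw2(t1, v, h) for v in x2])

# m1 is recovered only up to an additive constant, so differences are meaningful
d = marg_int(0.5) - marg_int(-0.5)        # true m1 difference is 1 - (-1) = 2
```

Averaging over the second coordinate integrates out m2 (up to a constant), leaving an estimate of the shape of m1 alone.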

A Study on Kernel Type Discontinuity Point Estimations

  • Huh, Jib
    • Journal of the Korean Data and Information Science Society
    • /
    • Vol. 14, No. 4
    • /
    • pp.929-937
    • /
    • 2003
  • Kernel-type estimators of a discontinuity point at an unknown location in a regression function or its derivatives have been developed. It is known that the discontinuity point estimator based on the Gasser-Müller regression estimator with a one-sided kernel function that vanishes at 0 shows poor asymptotic behavior. Furthermore, the asymptotic variance of the Gasser-Müller regression estimator in the random design case is 1.5 times larger than that in the corresponding fixed design case, while the two coincide for the local polynomial regression estimator. Even when the Gasser-Müller regression estimator is modified to use a one-sided kernel function with a non-zero value at 0, computer simulations show that this phenomenon also appears in discontinuity point estimation.

  • PDF
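The generic one-sided idea behind such detectors can be sketched as follows: the difference between right- and left-sided local averages peaks at the jump. This uses plain one-sided means with uniform windows, not the Gasser-Müller construction analyzed above:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500
x = np.sort(rng.uniform(0, 1, n))
y = 2.0 * (x > 0.6) + 0.1 * rng.standard_normal(n)   # jump of size 2 at x = 0.6

def one_sided_diff(t, h):
    """Right-sided minus left-sided local average at t."""
    right = (x > t) & (x <= t + h)
    left = (x >= t - h) & (x <= t)
    if not right.any() or not left.any():
        return 0.0
    return y[right].mean() - y[left].mean()

grid = np.linspace(0.1, 0.9, 161)
diffs = np.array([one_sided_diff(t, 0.1) for t in grid])
t_hat = grid[np.argmax(np.abs(diffs))]               # estimated jump location
```

The asymptotic comparisons in the paper concern how the choice of one-sided kernel and the design (fixed vs. random) affect the behavior of exactly this kind of statistic.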

On Variable Bandwidth Kernel Regression Estimation

  • 석정하;정성석;김대학
    • Journal of the Korean Data and Information Science Society
    • /
    • Vol. 9, No. 2
    • /
    • pp.179-188
    • /
    • 1998
  • Among kernel-type regression estimators, local polynomial regression is known to perform best. As with other kernel estimators, the bandwidth plays a crucial role in local polynomial regression; in particular, when the regression function has a complex structure, it is sensible to use a variable bandwidth. In this study we propose a fully automatic, fully data-driven variable bandwidth selection method. It selects by cross-validation the pilot bandwidths needed to pre-estimate bias and variance, estimates the MSE from these, and chooses the bandwidth that minimizes the estimated MSE. The superiority of the proposed method is confirmed through simulation. The proposed method is also significant in that it can resolve a problem that arises where the data are sparse, namely the singularity of X'X.

  • PDF
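The paper's procedure is local and uses cross-validated pilot bandwidths inside an MSE estimate. As a simplified, global stand-in showing the cross-validation ingredient, leave-one-out CV for a local constant fit looks like:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 200
x = np.sort(rng.uniform(0, 1, n))
y = np.sin(2 * np.pi * x) + 0.2 * rng.standard_normal(n)

def loo_cv_score(h):
    """Leave-one-out cross-validation score for a local constant fit."""
    d = x[:, None] - x[None, :]
    W = np.exp(-0.5 * (d / h) ** 2)
    np.fill_diagonal(W, 0.0)              # leave each point out of its own fit
    fits = (W @ y) / W.sum(axis=1)
    return np.mean((y - fits) ** 2)

bandwidths = np.linspace(0.01, 0.30, 30)
scores = [loo_cv_score(h) for h in bandwidths]
h_cv = bandwidths[int(np.argmin(scores))]
```

A variable-bandwidth version would repeat an MSE minimization of this kind at each estimation point x0 rather than once globally.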

Study on semi-supervised local constant regression estimation

  • Seok, Kyung-Ha
    • Journal of the Korean Data and Information Science Society
    • /
    • Vol. 23, No. 3
    • /
    • pp.579-585
    • /
    • 2012
  • Many different semi-supervised learning algorithms have been proposed for use wit unlabeled data. However, most of them focus on classification problems. In this paper we propose a semi-supervised regression algorithm called the semi-supervised local constant estimator (SSLCE), based on the local constant estimator (LCE), and reveal the asymptotic properties of SSLCE. We also show that the SSLCE has a faster convergence rate than that of the LCE when a well chosen weighting factor is employed. Our experiment with synthetic data shows that the SSLCE can improve performance with unlabeled data, and we recommend its use with the proper size of unlabeled data.
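The abstract does not spell out the SSLCE construction, so the variant below is loudly hypothetical: one natural way unlabeled x's can enter a local constant fit is by stabilising the denominator (a design-density estimate) while only labeled pairs feed the numerator. This is an illustration of the general idea, not the paper's estimator:

```python
import numpy as np

rng = np.random.default_rng(3)

def truth(t):
    return np.sin(2 * np.pi * t)

xl = rng.uniform(0, 1, 50)
yl = truth(xl) + 0.2 * rng.standard_normal(50)   # small labeled sample
xu = rng.uniform(0, 1, 500)                      # large unlabeled sample

def K(u):
    return np.exp(-0.5 * u ** 2)                 # Gaussian kernel (unnormalised)

def lce(t, h=0.1):
    """Plain local constant estimator (LCE) from labeled data only."""
    w = K((xl - t) / h)
    return np.sum(w * yl) / np.sum(w)

def ss_lce(t, h=0.1):
    """Hypothetical semi-supervised variant: unlabeled x's enter only the
    denominator. Consistent when both samples share the same design density."""
    num = np.mean(K((xl - t) / h) * yl)
    den = np.mean(K((np.concatenate([xl, xu]) - t) / h))
    return num / den
```

Both estimators target the same regression function; the paper's contribution is a weighting factor that provably improves the convergence rate, which this sketch does not attempt to reproduce.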

Estimation of Density via Local Polynomial Regression

  • Park, B. U.;Kim, W. C.;J. Huh;J. W. Jeon
    • Journal of the Korean Statistical Society
    • /
    • Vol. 27, No. 1
    • /
    • pp.91-100
    • /
    • 1998
  • A method of estimating a probability density using regression tools is presented. It is based on equal-length binning and a locally weighted approximate likelihood for the bin counts. The method is particularly useful for densities with bounded support, where it automatically corrects edge effects without using boundary kernels.

  • PDF
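A rough sketch of the binning-plus-regression idea. The paper smooths bin counts by locally weighted approximate likelihood; this sketch substitutes plain local linear least squares on the counts, so it only gestures at the method (local linear fitting is also what supplies the automatic boundary correction):

```python
import numpy as np

rng = np.random.default_rng(4)
data = rng.beta(2, 5, size=2000)          # bounded support [0, 1], mode at 0.2

# equal-length binning of the sample
nbins = 40
edges = np.linspace(0.0, 1.0, nbins + 1)
counts, _ = np.histogram(data, bins=edges)
centers = 0.5 * (edges[:-1] + edges[1:])
width = edges[1] - edges[0]

def smooth_counts(t, h=0.08):
    """Local linear least-squares fit to the bin counts at t."""
    w = np.exp(-0.5 * ((centers - t) / h) ** 2)
    X = np.stack([np.ones_like(centers), centers - t], axis=1)
    beta = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * counts))
    return beta[0]

grid = np.linspace(0.0, 1.0, 201)
dens = np.array([max(smooth_counts(t), 0.0) for t in grid]) / (len(data) * width)
dens /= dens.sum() * (grid[1] - grid[0])  # renormalise to integrate to 1
```

Dividing the smoothed counts by n times the bin width converts them to the density scale; clipping at zero and renormalising keeps the result a valid density.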

Robust Nonparametric Regression Method using Rank Transformation

  • Park, Dongryeon
    • Communications for Statistical Applications and Methods
    • /
    • Vol. 7, No. 2
    • /
    • pp.575-583
    • /
    • 2000
  • Consider the problem of estimating a regression function from data contaminated by a long-tailed error distribution. A linear smoother is a kind of locally weighted average of the responses, so it is not robust against outliers. The kernel M-smoother and the lowess attain robustness by down-weighting outliers; however, both require iteration to compute the robustness weights, and as Wang and Scott (1994) pointed out, the requirement of iteration is not a desirable property. In this article, we propose a robust nonparametric regression method that does not require iteration. Robustness can be achieved not only by down-weighting outliers but also by transforming them. The rank transformation is a simple procedure in which the data are replaced by their corresponding ranks. Iman and Conover (1979) showed that the rank transformation is a robust and powerful procedure in linear regression. In this paper, we show that the rank transformation can also be applied in nonparametric regression to achieve robustness.

  • PDF
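A sketch of the rank-transformation effect: smoothing ranks instead of raw responses caps the leverage of any single gross outlier. The back-transformation through the marginal quantile function of y is a crude illustration device, not necessarily the paper's construction:

```python
import numpy as np

rng = np.random.default_rng(5)
n = 200
x = np.sort(rng.uniform(0, 1, n))
y = np.sin(2 * np.pi * x) + 0.1 * rng.standard_normal(n)
y[100] += 50.0                      # a single gross outlier near x ~ 0.5

def nw(t, resp, h=0.08):
    """Local constant smoother of a response vector at t."""
    w = np.exp(-0.5 * ((x - t) / h) ** 2)
    return np.sum(w * resp) / np.sum(w)

# rank-transform the response, smooth the ranks, then map the fitted rank
# back through the empirical quantile function of y
ranks = np.argsort(np.argsort(y)) + 1.0
t0 = x[100]
r_hat = nw(t0, ranks)
fit_rank = np.interp(r_hat, np.arange(1, n + 1), np.sort(y))
fit_direct = nw(t0, y)              # raw smoother: dragged toward the outlier
true_val = np.sin(2 * np.pi * t0)
```

The outlier can shift its own rank by at most n positions, so the rank-based fit barely moves, while the raw local average is pulled far off target.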

Bandwidth Selection for Local Smoothing Jump Detector

  • Park, Dong-Ryeon
    • Communications for Statistical Applications and Methods
    • /
    • Vol. 16, No. 6
    • /
    • pp.1047-1054
    • /
    • 2009
  • The local smoothing jump detection procedure is a popular method for detecting jump locations, and the performance of the jump detector depends heavily on the choice of the bandwidth. However, little work has been done on this issue. In this paper, we propose a bootstrap bandwidth selection method which can be used for any kernel-based or local polynomial-based jump detector. The proposed bandwidth selection method is fully data-adaptive, and its performance is evaluated through a simulation study and a real data example.
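A schematic of one way a bootstrap can pick the detector bandwidth (not necessarily the paper's exact criterion): resample residuals around a two-sided piecewise-constant fit, re-detect the jump in each bootstrap sample, and keep the bandwidth whose re-detected locations vary least:

```python
import numpy as np

rng = np.random.default_rng(6)
n = 300
x = np.sort(rng.uniform(0, 1, n))
y = (x > 0.5).astype(float) + 0.2 * rng.standard_normal(n)  # jump of 1 at 0.5

grid = np.linspace(0.15, 0.85, 71)

def jump_location(yy, h):
    """Argmax of the one-sided local-average difference: a basic jump detector."""
    diffs = []
    for t in grid:
        r = (x > t) & (x <= t + h)
        l = (x >= t - h) & (x <= t)
        diffs.append(yy[r].mean() - yy[l].mean() if r.any() and l.any() else 0.0)
    return grid[np.argmax(np.abs(diffs))]

def bootstrap_spread(h, B=30):
    """Spread of re-detected jump locations over residual-bootstrap samples."""
    t0 = jump_location(y, h)
    fit = np.where(x > t0, y[x > t0].mean(), y[x <= t0].mean())  # two-sided means
    res = y - fit
    boot = [jump_location(fit + rng.choice(res, n, replace=True), h)
            for _ in range(B)]
    return np.std(boot)

bandwidths = [0.05, 0.10, 0.15, 0.20]
h_best = min(bandwidths, key=bootstrap_spread)   # smallest bootstrap variability
t_hat = jump_location(y, h_best)
```

Minimizing only the bootstrap spread ignores bias; a fuller criterion would trade the two off, which is the kind of balance a data-adaptive selector has to strike.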