• Title/Summary/Keyword: Regression estimator

Search Results: 311

Asymptotic Properties of Least Square Estimator of Disturbance Variance in the Linear Regression Model with MA(q)-Disturbances

  • Jong Hyup Lee;Seuck Heum Song
    • Communications for Statistical Applications and Methods
    • /
    • v.4 no.1
    • /
    • pp.111-117
    • /
    • 1997
  • The ordinary least squares estimator $S^2$ for the variance of the disturbances is considered in the linear regression model with autocorrelated disturbances. It is proved that the OLS estimator of the disturbance variance is asymptotically unbiased and weakly consistent when the disturbances are generated by an MA(q) process. In particular, the asymptotic unbiasedness and consistency of $S^2$ hold without any restriction on the regressor matrix.


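The behavior of $S^2$ under MA(q) disturbances can be illustrated with a small simulation. The MA(1) special case, coefficients, and sample size below are illustrative assumptions, not taken from the paper.

```python
import random

random.seed(0)

# Simulate y = b0 + b1 * x + u with MA(1) disturbances (an illustrative
# special case; the paper covers general MA(q) errors)
n = 2000
theta = 0.5
b0, b1 = 1.0, 2.0
e = [random.gauss(0.0, 1.0) for _ in range(n + 1)]
u = [e[t + 1] + theta * e[t] for t in range(n)]
x = [t / n for t in range(n)]
y = [b0 + b1 * x[t] + u[t] for t in range(n)]

# Ordinary least squares fit (simple-regression formulas)
xbar, ybar = sum(x) / n, sum(y) / n
sxx = sum((xi - xbar) ** 2 for xi in x)
sxy = sum((x[t] - xbar) * (y[t] - ybar) for t in range(n))
b1_hat = sxy / sxx
b0_hat = ybar - b1_hat * xbar

# S^2 = RSS / (n - p): the disturbance-variance estimator studied above
rss = sum((y[t] - b0_hat - b1_hat * x[t]) ** 2 for t in range(n))
s2 = rss / (n - 2)

# For MA(1), Var(u_t) = (1 + theta^2) * sigma_e^2, i.e. 1.25 here
true_var = 1.0 + theta ** 2
```

Despite the autocorrelation in the errors, $S^2$ lands near the true disturbance variance, consistent with the asymptotic unbiasedness claimed in the abstract.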
Weighted Least Absolute Error Estimation of Regression Parameters

  • Song, Moon-Sup
    • Journal of the Korean Statistical Society
    • /
    • v.8 no.1
    • /
    • pp.23-36
    • /
    • 1979
  • In the multiple linear regression model, a class of weighted least absolute error estimators, which minimize the sum of weighted absolute residuals, is proposed. It is shown that the weighted least absolute error estimators with Wilcoxon scores are equivalent to Koul's Wilcoxon-type estimator. Therefore, the asymptotic efficiency of the proposed estimator with Wilcoxon scores relative to the least squares estimator is the same as the Pitman efficiency of the Wilcoxon test relative to Student's t-test. To compute the estimates, the iterative weighted least squares method suggested by Schlossmacher is applicable.


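Schlossmacher's iterative weighted least squares scheme mentioned above can be sketched for the unweighted least absolute error fit in simple regression. The data and iteration counts below are illustrative choices, not from the paper.

```python
# Iteratively reweighted least squares for a least absolute error (LAE)
# fit, in the spirit of Schlossmacher's scheme cited in the abstract
# (simple regression only; an illustrative sketch)
def lae_fit(x, y, iters=100, eps=1e-8):
    b0, b1 = 0.0, 0.0
    for _ in range(iters):
        r = [yi - b0 - b1 * xi for xi, yi in zip(x, y)]
        # Reweight by the reciprocal absolute residual (guarded by eps)
        w = [1.0 / max(abs(ri), eps) for ri in r]
        sw = sum(w)
        xw = sum(wi * xi for wi, xi in zip(w, x)) / sw
        yw = sum(wi * yi for wi, yi in zip(w, y)) / sw
        sxx = sum(wi * (xi - xw) ** 2 for wi, xi in zip(w, x))
        sxy = sum(wi * (xi - xw) * (yi - yw) for wi, xi, yi in zip(w, x, y))
        b1 = sxy / sxx
        b0 = yw - b1 * xw
    return b0, b1

# Nine points on y = 1 + 2x plus one gross outlier: the LAE fit tracks
# the majority line, unlike an ordinary least squares fit would
xs = list(range(10))
ys = [1.0 + 2.0 * x for x in xs]
ys[9] = 100.0
b0_hat, b1_hat = lae_fit(xs, ys)
```

Each pass solves a weighted least squares problem whose weights $1/|r_i|$ turn the squared loss into an absolute loss at the fixed point, which is why the iteration approximates the LAE solution.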
Usage of auxiliary variable and neural network in doubly robust estimation

  • Park, Hyeonah;Park, Wonjun
    • Journal of the Korean Data and Information Science Society
    • /
    • v.24 no.3
    • /
    • pp.659-667
    • /
    • 2013
  • If either the regression model or the propensity model is correct, the unbiasedness of the estimator using doubly robust imputation can be guaranteed. When a neural network is used instead of a logistic regression model for the propensity model, the estimators using doubly robust imputation are approximately unbiased even when both assumed models fail. We also propose a doubly robust estimator of ratio form that uses population information on an auxiliary variable. We examine the properties of the proposed estimators through limited simulations.

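The doubly robust property described above can be demonstrated in a toy missing-data setting: the estimator of a population mean stays approximately unbiased when either the regression model or the propensity model is misspecified. All models and constants below are illustrative, not from the paper.

```python
import random

random.seed(1)

# Toy population: y depends linearly on x; response R depends on x
n = 5000
xs = [random.uniform(0.0, 1.0) for _ in range(n)]
ys = [2.0 + 3.0 * x + random.gauss(0.0, 1.0) for x in xs]
true_pi = [0.3 + 0.5 * x for x in xs]                     # response probabilities
rs = [1 if random.random() < p else 0 for p in true_pi]   # response indicators

def dr_mean(m, pi):
    # Doubly robust estimator of E[y]:
    # mu_hat = (1/n) * sum_i [ m(x_i) + R_i * (y_i - m(x_i)) / pi(x_i) ]
    return sum(m(xs[i]) + rs[i] * (ys[i] - m(xs[i])) / pi(xs[i])
               for i in range(n)) / n

# Correct regression model, misspecified propensity: still ~unbiased
est1 = dr_mean(lambda x: 2.0 + 3.0 * x, lambda x: 0.5)
# Misspecified regression model, correct propensity: still ~unbiased
est2 = dr_mean(lambda x: 0.0, lambda x: 0.3 + 0.5 * x)
# The true mean of y is 2 + 3 * E[x] = 3.5
```

Only when both working models fail does the bias persist, which is the gap the paper addresses by replacing the logistic propensity model with a more flexible neural network.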
Bayes Estimation in a Hierarchical Linear Model

  • Park, Kuey-Chung;Chang, In-Hong;Kim, Byung-Hwee
    • Journal of the Korean Statistical Society
    • /
    • v.27 no.1
    • /
    • pp.1-10
    • /
    • 1998
  • For the problem of estimating a vector of unknown regression coefficients under the sum of squared error losses in a hierarchical linear model, we propose a hierarchical Bayes estimator and then prove the admissibility of this estimator using Blyth's (1951) method.


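The abstract does not spell out the hierarchical model, so as a minimal conjugate illustration, the posterior-mean (Bayes) estimator of a single regression slope under a normal prior shrinks the least squares slope toward the prior mean. This one-parameter sketch is an assumption for exposition; the paper treats a general hierarchical linear model.

```python
# Bayes (posterior mean) estimate of beta in y_i = beta * x_i + eps_i,
# eps_i ~ N(0, sigma2), with prior beta ~ N(0, tau2). Illustrative
# conjugate special case; the paper proves admissibility of a
# hierarchical Bayes estimator via Blyth's method.
def bayes_slope(xs, ys, sigma2, tau2):
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    # Normal-normal conjugacy: ridge-like shrinkage of the OLS slope
    return sxy / (sxx + sigma2 / tau2)
```

A diffuse prior (large `tau2`) recovers the ordinary least squares slope, while an informative prior pulls the estimate toward zero.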
Expressions for Shrinkage Factors of PLS Estimator

  • Kim, Jong-Duk
    • Journal of the Korean Data and Information Science Society
    • /
    • v.17 no.4
    • /
    • pp.1169-1180
    • /
    • 2006
  • Partial least squares regression (PLS) is a biased, non-least squares regression method and is an alternative to ordinary least squares regression (OLS) when predictors are highly collinear or predictors outnumber observations. One way to understand the properties of biased regression methods is to know how the estimators shrink the OLS estimator. In this paper, we introduce an expression for the shrinkage factor of PLS, develop a new shrinkage expression, and then prove the equivalence of the two representations. We use two near-infrared (NIR) data sets to show the general behavior of the shrinkage and, in particular, for which eigendirections PLS expands the OLS coefficients.


Asymmetric Least Squares Estimation for A Nonlinear Time Series Regression Model

  • Kim, Tae Soo;Kim, Hae Kyoung;Yoon, Jin Hee
    • Communications for Statistical Applications and Methods
    • /
    • v.8 no.3
    • /
    • pp.633-641
    • /
    • 2001
  • The least squares method is usually applied when estimating the parameters of regression models. However, the least squares estimator is not very efficient when the distribution of the error is skewed. In this paper, we propose an asymmetric least squares estimator for a particular nonlinear time series regression model and give simple and practical sufficient conditions for the strong consistency of the estimators.


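The core idea of asymmetric least squares can be shown for the simplest case, a location parameter (an expectile): observations above the current estimate get weight $\tau$ and those below get $1-\tau$, and the weighted mean is iterated to a fixed point. This location-only sketch is illustrative; the paper treats a nonlinear time series regression model.

```python
# Asymmetric least squares (expectile) location estimate: iterate the
# weighted mean with weight tau above the current value and 1 - tau
# below (illustrative sketch of the asymmetric squared-loss criterion)
def expectile(ys, tau, iters=100):
    m = sum(ys) / len(ys)
    for _ in range(iters):
        w = [tau if y >= m else 1.0 - tau for y in ys]
        m = sum(wi * yi for wi, yi in zip(w, ys)) / sum(w)
    return m
```

With `tau = 0.5` the criterion reduces to ordinary least squares and the estimate is the sample mean; skewed values of `tau` let the estimator respond to one tail, which is the appeal when the error distribution is skewed.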
A Local Linear Kernel Estimator for Sparse Multinomial Data

  • Baek, Jangsun
    • Journal of the Korean Statistical Society
    • /
    • v.27 no.4
    • /
    • pp.515-529
    • /
    • 1998
  • Burman (1987) and Hall and Titterington (1987) studied kernel smoothing for sparse multinomial data in detail. Both of their estimators of cell probabilities are consistent under sparse asymptotics, subject to some restrictive conditions on the true cell probabilities. Dong and Simonoff (1994) adopted boundary kernels to relax these restrictive conditions. We propose a local linear kernel estimator, which is popular in nonparametric regression, to estimate cell probabilities. No boundary adjustment is necessary for this estimator, since it adapts automatically to estimation at the boundaries. It is shown that our estimator attains the optimal rate of convergence in mean sum of squared error under sparseness. Some simulation results and a real data application are presented to illustrate the performance of the estimator.


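The automatic boundary adaptation claimed above follows from a basic property of local linear fitting: it reproduces linear functions exactly, even at the ends of the grid, where a local constant smoother would be biased. A minimal sketch, assuming cell values are smoothed over an equispaced index grid with an (illustrative) Gaussian kernel:

```python
import math

def local_linear(xs, ys, x0, h):
    # Local linear estimate at x0: weighted least squares of ys on
    # (x - x0), returning the fitted local intercept
    w = [math.exp(-0.5 * ((x - x0) / h) ** 2) for x in xs]
    s0 = sum(w)
    s1 = sum(wi * (xi - x0) for wi, xi in zip(w, xs))
    s2 = sum(wi * (xi - x0) ** 2 for wi, xi in zip(w, xs))
    t0 = sum(wi * yi for wi, yi in zip(w, ys))
    t1 = sum(wi * (xi - x0) * yi for wi, xi, yi in zip(w, xs, ys))
    # Solve the 2x2 weighted normal equations for the intercept
    return (s2 * t0 - s1 * t1) / (s0 * s2 - s1 ** 2)
```

On exactly linear targets the estimator is exact at both interior and boundary points, so no boundary kernel is needed.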
Bezier curve smoothing of cumulative hazard function estimators

  • Cha, Yongseb;Kim, Choongrak
    • Communications for Statistical Applications and Methods
    • /
    • v.23 no.3
    • /
    • pp.189-201
    • /
    • 2016
  • In survival analysis, the Nelson-Aalen estimator and the Peterson estimator are often used to estimate the cumulative hazard function from randomly right-censored data. In this paper, we suggest smoothed versions of these cumulative hazard function estimators using a Bezier curve. We compare them with existing estimators, including a kernel-smoothed version of the Nelson-Aalen estimator and the Peterson estimator, in terms of mean integrated squared error, and show through numerical studies that the proposed estimators are better than the existing ones. Further, we apply our method to the Cox regression setting, where covariates are used as predictors, and suggest a survival function estimator at a given covariate.

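The two ingredients above, a cumulative hazard step-function estimator and a Bezier curve, can be sketched together. Using the Nelson-Aalen jump points directly as Bezier control points is a simplification for illustration, not the paper's exact construction, and ties in the event times are ignored.

```python
from math import comb

def nelson_aalen(times, events):
    # Nelson-Aalen cumulative hazard estimate for right-censored data;
    # events: 1 = observed event, 0 = censored (assumes no tied times)
    data = sorted(zip(times, events))
    n, cum, points = len(data), 0.0, []
    for i, (t, d) in enumerate(data):
        if d:
            cum += 1.0 / (n - i)   # increment: 1 / (number still at risk)
        points.append((t, cum))
    return points

def bezier(points, t):
    # Bezier curve with the given control points, evaluated at t in [0, 1]
    n = len(points) - 1
    bx = sum(comb(n, i) * t**i * (1 - t)**(n - i) * p[0]
             for i, p in enumerate(points))
    by = sum(comb(n, i) * t**i * (1 - t)**(n - i) * p[1]
             for i, p in enumerate(points))
    return bx, by
```

The Bezier curve interpolates the first and last control points and smooths through the interior ones, replacing the step function with a smooth cumulative hazard estimate.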
LIL FOR KERNEL ESTIMATOR OF ERROR DISTRIBUTION IN REGRESSION MODEL

  • Niu, Si-Li
    • Journal of the Korean Mathematical Society
    • /
    • v.44 no.4
    • /
    • pp.835-844
    • /
    • 2007
  • This paper considers the problem of estimating the error distribution function in nonparametric regression models. Sufficient conditions are given under which the kernel estimator of the error distribution function based on nonparametric residuals satisfies the law of iterated logarithm.

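The estimator studied above, a kernel estimate of the error distribution function built from residuals, can be sketched as an integrated-kernel (smoothed empirical) CDF. The Gaussian kernel, bandwidth, and i.i.d. stand-in residuals below are illustrative assumptions; in the paper the residuals come from a nonparametric regression fit.

```python
import math
import random

random.seed(3)

def kernel_cdf(residuals, t, h):
    # Integrated-kernel (Gaussian) estimate of the error distribution
    # F(t): the average of Phi((t - r_i) / h) over the residuals r_i
    norm_cdf = lambda z: 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
    return sum(norm_cdf((t - r) / h) for r in residuals) / len(residuals)

# Stand-in residuals (illustrative): an i.i.d. N(0, 1) sample in place of
# nonparametric regression residuals
res = [random.gauss(0.0, 1.0) for _ in range(2000)]
f0 = kernel_cdf(res, 0.0, 0.3)
```

The law of the iterated logarithm result in the paper concerns the almost-sure fluctuation rate of exactly this kind of estimator around the true error distribution.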
Study on semi-supervised local constant regression estimation

  • Seok, Kyung-Ha
    • Journal of the Korean Data and Information Science Society
    • /
    • v.23 no.3
    • /
    • pp.579-585
    • /
    • 2012
  • Many different semi-supervised learning algorithms have been proposed for use with unlabeled data. However, most of them focus on classification problems. In this paper we propose a semi-supervised regression algorithm called the semi-supervised local constant estimator (SSLCE), based on the local constant estimator (LCE), and derive the asymptotic properties of the SSLCE. We also show that the SSLCE has a faster convergence rate than the LCE when a well-chosen weighting factor is employed. Our experiment with synthetic data shows that the SSLCE can improve performance with unlabeled data, and we recommend its use with a suitable amount of unlabeled data.
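
The building block of the SSLCE is the local constant estimator, i.e. the Nadaraya-Watson kernel regression estimator; the paper's semi-supervised weighting factor is specific to the paper and is not reproduced here. A minimal LCE sketch with an (illustrative) Gaussian kernel:

```python
import math

def local_constant(xs, ys, x0, h):
    # Local constant (Nadaraya-Watson) estimate at x0: a kernel-weighted
    # average of the labeled responses
    w = [math.exp(-0.5 * ((x - x0) / h) ** 2) for x in xs]
    return sum(wi * yi for wi, yi in zip(w, ys)) / sum(w)
```

The semi-supervised extension reweights this estimator using the distribution of the unlabeled predictors, which is where the claimed faster convergence rate comes from.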