• Title/Summary/Keyword: Regression estimators


Robust Regression for Right-Censored Data

  • Kim, Chul-Ki
    • Journal of Korean Society for Quality Management, v.25 no.2, pp.47-59, 1997
  • In this paper we develop computational algorithms to calculate M-estimators of regression parameters from right-censored data of the kind that arises naturally in quality control. A new statistical method is also introduced to incorporate concomitant scale estimation for M-estimators in the presence of right censoring on the observed responses. The methods are illustrated by simulations.
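
As a rough illustration of the general idea (not the authors' censoring-adjusted algorithm), the sketch below fits a Huber M-estimator by iteratively reweighted least squares with a concomitant MAD scale estimate; the right-censoring correction developed in the paper is omitted, and the tuning constant c = 1.345 is a conventional choice assumed here.

```python
import numpy as np

def huber_weights(r, c=1.345):
    """Huber weights w(r) = psi(r)/r: 1 for small residuals, c/|r| beyond c."""
    a = np.abs(r)
    w = np.ones_like(r)
    w[a > c] = c / a[a > c]
    return w

def m_estimate(X, y, c=1.345, tol=1e-8, max_iter=100):
    """Huber M-estimator of regression coefficients via IRLS,
    with the scale re-estimated each pass from the MAD of the residuals."""
    beta = np.linalg.lstsq(X, y, rcond=None)[0]          # OLS starting value
    scale = 1.0
    for _ in range(max_iter):
        r = y - X @ beta
        scale = np.median(np.abs(r - np.median(r))) / 0.6745   # MAD scale
        w = huber_weights(r / max(scale, 1e-12), c)
        sw = np.sqrt(w)
        beta_new = np.linalg.lstsq(sw[:, None] * X, sw * y, rcond=None)[0]
        if np.max(np.abs(beta_new - beta)) < tol:
            beta = beta_new
            break
        beta = beta_new
    return beta, scale
```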

Two Bootstrap Confidence Intervals of Ridge Regression Estimators in Mixture Experiments (혼합물실험에서 능형회귀추정량에 대한 두 종류의 붓스트랩 신뢰구간)

  • Jang Dae-Heung
    • The Korean Journal of Applied Statistics, v.19 no.2, pp.339-347, 2006
  • In mixture experiments, performing experiments in highly constrained regions causes collinearity problems. Ridge regression can be used as a means of stabilizing the coefficient estimators in the fitted model, but there is no theory available on which to base statistical inference for ridge estimators. The bootstrap technique can be used to obtain confidence intervals for the ridge estimators.
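
A minimal sketch of one such interval, assuming a case-resampling percentile bootstrap and a fixed shrinkage constant k; the paper's two specific interval constructions and the mixture-experiment design are not reproduced here.

```python
import numpy as np

def ridge_fit(X, y, k):
    """Ridge estimator (X'X + kI)^{-1} X'y for a given shrinkage constant k."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + k * np.eye(p), X.T @ y)

def ridge_bootstrap_ci(X, y, k, B=2000, alpha=0.05, seed=0):
    """Percentile bootstrap confidence interval for each ridge coefficient,
    resampling rows (cases) of the data with replacement."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    boot = np.empty((B, p))
    for b in range(B):
        idx = rng.integers(0, n, n)
        boot[b] = ridge_fit(X[idx], y[idx], k)
    lower = np.percentile(boot, 100 * alpha / 2, axis=0)
    upper = np.percentile(boot, 100 * (1 - alpha / 2), axis=0)
    return ridge_fit(X, y, k), lower, upper
```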

A Study on Change of Logistics in the Region of Seoul, Incheon, and Kyunggi (물류예측모형에 관한 연구 -수도권 물동량 예측을 중심으로-)

  • Roh Kyung-Ho
    • Management & Information Systems Review, v.7, pp.427-450, 2001
  • This research suggests an estimation methodology for logistics. The paper elucidates the main problems associated with estimation in the regression model. We review methods for estimating the parameters in the model and introduce a modified procedure in which all models are fitted and combined to construct a combination of estimates. The resulting estimators are found to be as efficient as the maximum likelihood (ML) estimators in various cases. Our method requires more computation but has an advantage for large data sets, and it also enables us to detect particular features in the data structure. Examples with real data are used to illustrate the properties of the estimators. The background for estimating the logistics regression model is the increasing importance of the logistics environment today. In the first phase, we conduct an exploratory study of 9 independent variables. In the second phase, we search for the best-fitting logistics regression model. In the third phase, we compute the logistics estimates using that model. The parameters of the logistics regression model were estimated using ordinary least squares (OLS) regression, and the standard assumptions of OLS estimation were tested. The calculated F-statistic for the model is significant at the 5% level, and the model explains a significant amount of the variance in the dependent variable. The parameter estimates, with t-statistics in parentheses, are presented in a table. The object of this paper is to find the best regression model for producing comparatively accurate logistics estimates.
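
A minimal sketch of the estimation step described above, assuming a standard OLS fit with the overall F-test of the slopes; the 9 freight-volume predictors and the actual data are not reproduced.

```python
import numpy as np
from scipy import stats

def ols_with_f_test(X, y):
    """OLS fit plus the overall F-statistic for H0: all slope coefficients are zero.
    X holds the p predictors (without intercept); an intercept is added here."""
    n, p = X.shape
    Xd = np.column_stack([np.ones(n), X])
    beta, _, _, _ = np.linalg.lstsq(Xd, y, rcond=None)
    resid = y - Xd @ beta
    rss = np.sum(resid ** 2)
    tss = np.sum((y - y.mean()) ** 2)
    f_stat = ((tss - rss) / p) / (rss / (n - p - 1))
    p_value = stats.f.sf(f_stat, p, n - p - 1)
    return beta, f_stat, p_value
```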

Approximate Variance of Least Square Estimators for Regression Coefficient under Inclusion Probability Proportional to Size Sampling (포함확률비례추출에서 회귀계수 최소제곱추정량의 근사분산)

  • Kim, Kyu-Seong
    • Communications for Statistical Applications and Methods, v.19 no.1, pp.23-32, 2012
  • This paper deals with the bias and variance of regression coefficient estimators in a finite population. We derive approximate formulas for the bias, variance and mean square error of two estimators when a fixed-size sample is selected with inclusion probabilities proportional to size and the regression coefficients are then estimated from the sampled data by the ordinary least squares estimator as well as by the weighted least squares estimator. Necessary and sufficient conditions for comparing the two estimators in terms of variance and mean square error are given. In addition, a simple example is introduced to numerically compare the variance and mean square error of the two estimators.
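
A minimal sketch of the two estimators being compared, assuming the weighted estimator uses survey weights 1/π_i; the paper's approximate bias and variance formulas are not reproduced here.

```python
import numpy as np

def ols(X, y):
    """Unweighted ordinary least squares on the sampled data."""
    return np.linalg.lstsq(X, y, rcond=None)[0]

def survey_wls(X, y, pi):
    """Weighted least squares with survey weights w_i = 1/pi_i,
    where pi_i is the inclusion probability of sampled unit i."""
    w = 1.0 / pi
    XtWX = X.T @ (w[:, None] * X)
    XtWy = X.T @ (w * y)
    return np.linalg.solve(XtWX, XtWy)
```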

First Order Difference-Based Error Variance Estimator in Nonparametric Regression with a Single Outlier

  • Park, Chun-Gun
    • Communications for Statistical Applications and Methods, v.19 no.3, pp.333-344, 2012
  • We consider some statistical properties of the first order difference-based error variance estimator in nonparametric regression models with a single outlier. Such difference-based estimators have rarely been discussed in the presence of outliers. We propose a first order difference-based estimator that uses the leave-one-out method to detect a single outlier, and we study the detection by simulation in a nonparametric regression model with a single outlier; the outlier detection works well. The results are promising even in nonparametric regression models with many outliers when some difference-based estimators are used.
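
A minimal sketch, assuming the classical Rice-type first-order difference estimator and a simple leave-one-out screen for a single outlier; the paper's exact detection rule may differ.

```python
import numpy as np

def rice_variance(y):
    """First-order difference-based error variance estimator (Rice type):
    sigma2_hat = sum (y_i - y_{i-1})^2 / (2 (n - 1)),
    with y assumed ordered by the design points."""
    d = np.diff(y)
    return np.sum(d ** 2) / (2.0 * (len(y) - 1))

def leave_one_out_variances(y):
    """Recompute the difference-based estimate with each observation removed.
    A point whose removal drops the estimate sharply is an outlier candidate."""
    n = len(y)
    return np.array([rice_variance(np.delete(y, i)) for i in range(n)])
```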

Asymmetric Least Squares Estimation for a Nonlinear Time Series Regression Model

  • Kim, Tae Soo;Kim, Hae Kyoung;Yoon, Jin Hee
    • Communications for Statistical Applications and Methods, v.8 no.3, pp.633-641, 2001
  • The least squares method is usually applied when estimating the parameters of regression models. However, the least squares estimator is not very efficient when the distribution of the errors is skewed. In this paper, we propose the asymmetric least squares estimator for a particular nonlinear time series regression model and give simple and practical sufficient conditions for the strong consistency of the estimators.
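
A minimal sketch of asymmetric least squares fitting for a generic nonlinear model, assuming a numerical optimizer; the particular time series model and the consistency conditions treated in the paper are not reproduced.

```python
import numpy as np
from scipy.optimize import minimize

def als_loss(resid, tau):
    """Asymmetric squared loss: weight tau on nonnegative residuals and
    (1 - tau) on negative ones (tau = 0.5 recovers ordinary least squares)."""
    w = np.where(resid >= 0, tau, 1.0 - tau)
    return np.sum(w * resid ** 2)

def als_fit(model, theta0, t, y, tau=0.7):
    """Fit the parameters of a user-supplied model y ~ model(theta, t)
    by minimizing the asymmetric least squares criterion."""
    obj = lambda theta: als_loss(y - model(theta, t), tau)
    return minimize(obj, theta0, method="Nelder-Mead").x

# example (illustrative trend model only, not the paper's model):
# model = lambda th, t: th[0] * np.exp(th[1] * t)
```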

ROBUST CROSS VALIDATIONS IN RIDGE REGRESSION

  • Jung, Kang-Mo
    • Journal of Applied Mathematics & Informatics, v.27 no.3_4, pp.903-908, 2009
  • The choice of the shrinkage parameter in ridge regression may be contaminated by outlying points. We propose robust cross validation scores for ridge regression in place of classical cross validation, using robust location estimators such as the median, least trimmed squares, and the absolute mean for the robust scores. The robust scores have global robustness. Simulations are performed to show the effectiveness of the proposed estimators.
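
A minimal sketch using the median as the robust location estimator for a leave-one-out score; the least trimmed squares and absolute-mean variants mentioned above would simply replace the aggregation function.

```python
import numpy as np

def ridge_fit(X, y, k):
    """Ridge estimator (X'X + kI)^{-1} X'y."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + k * np.eye(p), X.T @ y)

def robust_cv_score(X, y, k, agg=np.median):
    """Leave-one-out cross validation for ridge with a robust aggregate:
    the squared prediction errors are combined by `agg` (here the median)
    instead of the mean, so a single outlier cannot dominate the score."""
    n = X.shape[0]
    errs = np.empty(n)
    for i in range(n):
        keep = np.arange(n) != i
        beta = ridge_fit(X[keep], y[keep], k)
        errs[i] = (y[i] - X[i] @ beta) ** 2
    return agg(errs)

def choose_shrinkage(X, y, grid):
    """Pick the shrinkage constant minimizing the robust CV score."""
    scores = [robust_cv_score(X, y, k) for k in grid]
    return grid[int(np.argmin(scores))]
```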

A Comparative Analysis of Estimators of a Nonlinear Regression Model Using Monte Carlo Simulation (몬테칼로 시뮬레이션을 이용한 비선형회귀추정량들의 비교 분석)

  • Kim, Tae Soo;Lee, Young Hae
    • Journal of the Korea Society for Simulation, v.9 no.3, pp.43-51, 2000
  • In regression models, we estimate the unknown parameters by various methods: the least squares method, which is the most common, as well as the least absolute deviation, regression quantile, and asymmetric least squares methods. In this paper, we compare these estimators in two ways: first, a theoretical comparison in the asymptotic sense, and then a practical comparison using Monte Carlo simulation for small sample sizes.
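
For reference, the four estimation criteria being compared can be written as residual loss functions; this is a generic sketch and is not tied to the specific nonlinear model or designs used in the simulation study.

```python
import numpy as np

# Each criterion minimizes the sum over observations of a loss of the residual u.
def ls_loss(u):                   # least squares
    return u ** 2

def lad_loss(u):                  # least absolute deviation
    return np.abs(u)

def quantile_loss(u, tau=0.5):    # regression quantile (check function)
    return u * (tau - (u < 0))

def als_loss(u, tau=0.5):         # asymmetric least squares (expectile)
    return np.abs(tau - (u < 0)) * u ** 2
```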

Monte Carlo Simulation of the Estimators for a Nonlinear Regression Model (비선형 회귀모형 추정량들의 몬테칼로 시뮬레이션에 의한 비교)

  • Kim, Tae Soo;Lee, Young Hae
    • Proceedings of the Korea Society for Simulation Conference, 2000.11a, pp.6-10, 2000
  • In regression models, we estimate the unknown parameters using various methods: the least squares method, which is the most common, the least absolute deviation method, the regression quantile method, and the asymmetric least squares method. In this paper, we compare these estimators in two ways: first, a theoretical comparison in the asymptotic sense, and then a practical comparison using Monte Carlo simulation for a small sample size.
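
A minimal Monte Carlo driver in the same spirit, assuming an illustrative exponential-trend model and skewed (centered exponential) errors; it compares only the least squares and least absolute deviation fits rather than all four criteria from the study.

```python
import numpy as np
from scipy.optimize import minimize

def fit(t, y, loss, theta0):
    """Fit an illustrative nonlinear model y ~ a * exp(b * t) by minimizing
    the given residual loss (this model is an assumption, not the paper's)."""
    obj = lambda th: np.sum(loss(y - th[0] * np.exp(th[1] * t)))
    return minimize(obj, theta0, method="Nelder-Mead").x

def monte_carlo(n=20, reps=200, true=(2.0, 0.3), seed=1):
    """Average squared parameter error of LS vs LAD under skewed errors."""
    rng = np.random.default_rng(seed)
    t = np.linspace(0, 1, n)
    mse = {"LS": 0.0, "LAD": 0.0}
    for _ in range(reps):
        y = true[0] * np.exp(true[1] * t) + rng.exponential(0.5, n) - 0.5
        for name, loss in [("LS", lambda u: u ** 2), ("LAD", np.abs)]:
            est = fit(t, y, loss, np.array([1.0, 0.0]))
            mse[name] += np.sum((est - np.array(true)) ** 2) / reps
    return mse
```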

Bootstrap Confidence Intervals of Ridge Estimators in Mixture Experiments (혼합물실험에서 능형추정량에 대한 붓스트랩 신뢰구간)

  • Jang, Dae-Heung
    • Journal of Korean Society for Quality Management, v.34 no.3, pp.62-65, 2006
  • In mixture experiments, performing experiments in highly constrained regions causes collinearity problems, and ridge regression can be used as a means of stabilizing the coefficient estimators in the fitted model. But there is no theory available on which to base statistical inference for ridge estimators. The bootstrap can be used to obtain confidence intervals for the ridge estimators.
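
A small numerical illustration of the collinearity point, assuming an arbitrary constrained mixture region and an arbitrary shrinkage value k; adding kI to X'X raises its smallest eigenvalue and therefore lowers the condition number.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 30
x1 = rng.uniform(0.40, 0.55, n)        # an assumed highly constrained region
x2 = rng.uniform(0.25, 0.35, n)
x3 = 1.0 - x1 - x2                     # mixture constraint: components sum to 1
X = np.column_stack([x1, x2, x3])

k = 0.05                               # an assumed shrinkage constant
print(np.linalg.cond(X.T @ X))              # condition number without ridge
print(np.linalg.cond(X.T @ X + k * np.eye(3)))  # strictly smaller with ridge
```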