• Title/Abstract/Keyword: Least-squares Regression

Search results: 563 items (processing time: 0.02 s)

DETECTION OF OUTLIERS IN WEIGHTED LEAST SQUARES REGRESSION

  • Shon, Bang-Yong;Kim, Guk-Boh
    • Journal of applied mathematics & informatics / Vol. 4, No. 2 / pp.501-512 / 1997
  • In the multiple linear regression model we presuppose assumptions (independence, normality, variance homogeneity, and so on) on the error term. When case weights are given because of variance heterogeneity, we can estimate the regression parameters efficiently using the weighted least squares estimator. Unfortunately, this estimator is sensitive to outliers, like the ordinary least squares estimator. Thus in this paper we propose some statistics for the detection of outliers in weighted least squares regression.
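The weighted least squares setup the abstract starts from can be sketched numerically. This is a minimal illustration of WLS under variance heterogeneity on synthetic data (not the paper's outlier-detection statistics); the data-generating model and weights are illustrative assumptions.

```python
# Weighted least squares on synthetic heteroscedastic data: case
# weights are taken as inverse error variances, the classical WLS setup.
import numpy as np

rng = np.random.default_rng(0)
n = 200
x = rng.uniform(0, 10, n)
sigma = 0.5 + 0.3 * x                      # error sd grows with x (heteroscedasticity)
y = 2.0 + 1.5 * x + rng.normal(0, sigma)   # true model: intercept 2, slope 1.5

X = np.column_stack([np.ones(n), x])
w = 1.0 / sigma**2                         # case weights = inverse variances

# WLS estimate: beta = (X' W X)^{-1} X' W y
W = np.diag(w)
beta_wls = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
beta_ols = np.linalg.lstsq(X, y, rcond=None)[0]   # unweighted fit for comparison
```

Because each weight is the reciprocal of that case's error variance, the WLS estimator is the efficient one here; a single outlier with a large weight, however, can dominate the fit, which is the sensitivity the abstract addresses.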

Asymmetric least squares regression estimation using weighted least squares support vector machine

  • Hwan, Chang-Ha
    • Journal of the Korean Data and Information Science Society / Vol. 22, No. 5 / pp.999-1005 / 2011
  • This paper proposes a weighted least squares support vector machine for asymmetric least squares regression. This method achieves nonlinear prediction power while making no assumptions about the underlying probability distributions. A cross validation function is introduced to choose optimal hyperparameters in the procedure. Experimental results are then presented which indicate the performance of the proposed model.

Asymmetric Least Squares Estimation for A Nonlinear Time Series Regression Model

  • Kim, Tae Soo;Kim, Hae Kyoung;Yoon, Jin Hee
    • Communications for Statistical Applications and Methods / Vol. 8, No. 3 / pp.633-641 / 2001
  • The least squares method is usually applied when estimating the parameters in regression models. However, the least squares estimator is not very efficient when the distribution of the error is skewed. In this paper, we propose the asymmetric least squares estimator for a particular nonlinear time series regression model and give simple and practical sufficient conditions for the strong consistency of the estimators.

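Asymmetric least squares replaces the symmetric squared loss with one that weights positive and negative residuals differently, estimating an expectile rather than the mean. A minimal sketch of the linear case via iteratively reweighted least squares follows; the choice of tau and the synthetic skewed errors are illustrative, not from the paper (which treats a nonlinear time series model).

```python
# Asymmetric (expectile) least squares: the squared residual is weighted
# by tau when positive and (1 - tau) when negative.
import numpy as np

def als_fit(X, y, tau=0.8, n_iter=50):
    beta = np.linalg.lstsq(X, y, rcond=None)[0]   # start from the OLS fit
    for _ in range(n_iter):
        r = y - X @ beta
        w = np.where(r >= 0, tau, 1.0 - tau)      # asymmetric residual weights
        beta = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * y))
    return beta

rng = np.random.default_rng(1)
x = rng.uniform(0, 5, 300)
y = 1.0 + 2.0 * x + rng.standard_exponential(300)   # right-skewed errors
X = np.column_stack([np.ones_like(x), x])

beta_mean = np.linalg.lstsq(X, y, rcond=None)[0]    # tau = 0.5: ordinary LS
beta_hi = als_fit(X, y, tau=0.9)                    # upper expectile line
```

With skewed errors the tau = 0.9 fit lies above the mean fit, which is the extra distributional information asymmetric least squares extracts.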

NONLINEAR ASYMMETRIC LEAST SQUARES ESTIMATORS

  • Park, Seung-Hoe;Kim, Hae-Kyung;Lee, Young
    • Journal of the Korean Statistical Society / Vol. 32, No. 1 / pp.47-64 / 2003
  • In this paper, we consider the asymptotic properties of asymmetric least squares estimators for nonlinear regression models. This paper provides sufficient conditions for strong consistency and asymptotic normality of the proposed estimators and derives the asymptotic relative efficiency of the proposed estimators to the regression quantile estimators. We give some examples and results of a Monte Carlo simulation to compare the asymmetric least squares estimators with the regression quantile estimators.

Noisy label based discriminative least squares regression and its kernel extension for object identification

  • Liu, Zhonghua;Liu, Gang;Pu, Jiexin;Liu, Shigang
    • KSII Transactions on Internet and Information Systems (TIIS) / Vol. 11, No. 5 / pp.2523-2538 / 2017
  • In most of the existing literature, the definition of the class label has the following characteristics. First, the class label of samples from the same object has an absolutely fixed value. Second, the difference between class labels of samples from different objects should be maximized. However, the appearance of a face varies greatly with illumination, pose, and expression, so the previous definition of the class label is not entirely reasonable. Inspired by the discriminative least squares regression (DLSR) algorithm, a noisy label based discriminative least squares regression (NLDLSR) algorithm is presented in this paper. In our algorithm, the difference between the class labels of samples from different objects is still maximized. Meanwhile, the class labels of different samples from the same object are allowed to differ slightly, which is consistent with the fact that different samples from the same object exhibit some variation. In addition, the proposed NLDLSR is extended to the kernel space, yielding a novel kernel noisy label based discriminative least squares regression (KNLDLSR) algorithm. A large number of experiments show that the proposed algorithms achieve very good performance.
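The "absolutely fixed" label setup that DLSR and NLDLSR relax can be shown concretely: ordinary ridge-regularized least squares regression onto fixed one-hot label vectors, with classification by the largest regression output. This is only a sketch of that baseline on toy Gaussian data; the noisy-label relaxation itself is not implemented here.

```python
# Least squares regression onto fixed one-hot class labels: the
# baseline whose rigid targets the noisy-label variants relax.
import numpy as np

rng = np.random.default_rng(2)
X0 = rng.normal([0, 0], 0.5, size=(50, 2))   # class 0 around (0, 0)
X1 = rng.normal([3, 3], 0.5, size=(50, 2))   # class 1 around (3, 3)
X = np.vstack([X0, X1])
y = np.array([0] * 50 + [1] * 50)

Xb = np.column_stack([X, np.ones(len(X))])   # append a bias feature
T = np.eye(2)[y]                             # one-hot targets, fixed per class
lam = 1e-2                                   # small ridge term for stability

# W minimizes ||Xb W - T||_F^2 + lam ||W||_F^2
W = np.linalg.solve(Xb.T @ Xb + lam * np.eye(3), Xb.T @ T)

pred = (Xb @ W).argmax(axis=1)               # predict the largest output
accuracy = (pred == y).mean()
```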

Unified Non-iterative Algorithm for Principal Component Regression, Partial Least Squares and Ordinary Least Squares

  • Kim, Jong-Duk
    • Journal of the Korean Data and Information Science Society / Vol. 14, No. 2 / pp.355-366 / 2003
  • A unified procedure for principal component regression (PCR), partial least squares (PLS) and ordinary least squares (OLS) is proposed. The process gives solutions for PCR, PLS and OLS in a unified and non-iterative way. This enables us to see the interrelationships among the three regression coefficient vectors, and it is seen that the so-called E-matrix in the solution expression plays the key role in differentiating the methods. In addition to setting out the procedure, the paper also supplies a robust numerical algorithm for its implementation, which is used to show how the procedure performs on a real world data set.

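The three methods the unified procedure covers can be compared on the same data. This sketch computes OLS, PCR, and PLS1 coefficients with the separate standard algorithms (SVD truncation for PCR, NIPALS for PLS), not the paper's unified non-iterative procedure; the near-collinear design is an illustrative assumption chosen to make the shrinkage visible.

```python
# OLS, principal component regression (PCR), and PLS1 coefficient
# vectors on one near-collinear data set.
import numpy as np

rng = np.random.default_rng(3)
n, p, k = 100, 5, 2                              # k = number of components
X = rng.normal(size=(n, p))
X[:, 3] = X[:, 0] + 0.01 * rng.normal(size=n)    # near-collinear column
beta_true = np.array([1.0, 0.5, 0.0, 0.0, -0.5])
y = X @ beta_true + 0.1 * rng.normal(size=n)

# OLS
b_ols = np.linalg.lstsq(X, y, rcond=None)[0]

# PCR: regress y on the first k principal components of X
U, s, Vt = np.linalg.svd(X, full_matrices=False)
b_pcr = Vt[:k].T @ np.diag(1 / s[:k]) @ U[:, :k].T @ y

# PLS1 via NIPALS with k components
Xd, yd = X.copy(), y.copy()
Ws, Ps, q = [], [], []
for _ in range(k):
    w = Xd.T @ yd; w /= np.linalg.norm(w)        # weight vector
    t = Xd @ w; tt = t @ t                       # score vector
    p_load = Xd.T @ t / tt                       # X loading
    qk = t @ yd / tt                             # y loading
    Xd -= np.outer(t, p_load); yd -= qk * t      # deflation
    Ws.append(w); Ps.append(p_load); q.append(qk)
Wm, Pm = np.array(Ws).T, np.array(Ps).T
b_pls = Wm @ np.linalg.solve(Pm.T @ Wm, np.array(q))
```

Both component-based fits shrink relative to OLS: the norms of `b_pcr` and `b_pls` do not exceed the norm of `b_ols`, which is the interrelationship the unified solution expression makes explicit.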

Shrinkage Structure of Ridge Partial Least Squares Regression

  • Kim, Jong-Duk
    • Journal of the Korean Data and Information Science Society / Vol. 18, No. 2 / pp.327-344 / 2007
  • Representative biased regression methods for multicollinear data include ridge regression (RR), principal component regression (PCR), and partial least squares regression (PLS). These are called shrinkage regressions in the sense that the norm of their coefficient-vector estimator is smaller than that of the ordinary least squares (OLS) estimator. Newer methods combine RR with PCR (ridge principal component regression, RPCR) and RR with PLS (ridge partial least squares regression, RPLS); these are also shrinkage regressions. The estimators can be expressed as linear combinations of the eigenvectors of X'X, so one can study how much each eigen-direction is shrunk relative to OLS. In this paper we first express these estimators in terms of general shrinkage factors, use this to obtain a general expression for the MSE, and also derive the MSE of the PLS estimator. We then derive the shrinkage factors of RPLS in two different forms; substituting them into the general MSE expression immediately yields the MSE of RPLS. However, the shrinkage factors of PLS and RPLS are complicated nonlinear functions of y and are not deterministic, so these MSE expressions are only approximate. Using this MSE to evaluate PLS or RPLS is therefore of limited value, and the performance of these regressions needs to be assessed empirically. Using near-infrared spectroscopy data, a typical multicollinear data set, we examine how the derived shrinkage-factor values change with the number of factors, as well as the overall shrinkage ratio. A good understanding of these shrinkage patterns should be of considerable help in assessing the predictive power and stability of the regression methods.

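The per-direction shrinkage factors the abstract describes have a simple closed form for the deterministic methods: OLS keeps every eigen-direction of X'X at factor 1, ridge shrinks direction i by lam_i/(lam_i + k), and PCR sets the factor to 1 or 0. A minimal numerical sketch follows; the PLS and RPLS factors, which depend nonlinearly on y, are omitted, and the design matrix is an illustrative assumption.

```python
# Shrinkage factors along the eigen-directions of X'X for OLS, ridge,
# and PCR on a collinear design.
import numpy as np

rng = np.random.default_rng(4)
X = rng.normal(size=(80, 4))
X[:, 2] = X[:, 1] + 0.05 * rng.normal(size=80)   # collinearity -> small eigenvalue

ridge_k = 1.0                                    # ridge constant
eigvals = np.linalg.eigvalsh(X.T @ X)[::-1]      # eigenvalues, descending

f_ols = np.ones_like(eigvals)                    # no shrinkage in any direction
f_ridge = eigvals / (eigvals + ridge_k)          # smooth per-direction shrinkage
f_pcr = np.array([1.0, 1.0, 1.0, 0.0])           # keep 3 components, drop the last

for name, f in [("OLS", f_ols), ("ridge", f_ridge), ("PCR", f_pcr)]:
    print(name, np.round(f, 3))
```

The ridge factors show the characteristic pattern: directions with large eigenvalues are barely shrunk, while the near-degenerate direction created by the collinear column is shrunk hardest.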

REGRESSION WITH CENSORED DATA BY LEAST SQUARES SUPPORT VECTOR MACHINE

  • Kim, Dae-Hak;Shim, Joo-Yong;Oh, Kwang-Sik
    • Journal of the Korean Statistical Society / Vol. 33, No. 1 / pp.25-34 / 2004
  • In this paper we propose a prediction method for the regression model with randomly censored observations in the training data set. Least squares support vector machine regression is applied for prediction of the regression function by incorporating weights, assigned to each observation, into the optimization problem. Numerical examples are given to show the performance of the proposed prediction method.
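One standard way to build such per-observation weights for censored responses is inverse-probability-of-censoring weighting (IPCW) from a Kaplan-Meier estimate of the censoring distribution. The sketch below plugs these weights into plain weighted least squares on a linear model; the paper applies observation weights inside an LS-SVM, which is not reproduced here, and the data-generating model is an illustrative assumption.

```python
# Censored responses handled through per-observation weights: censored
# cases get weight 0, uncensored cases get 1/G(t-), where G is the
# Kaplan-Meier estimate of the censoring survival function.
import numpy as np

def km_censoring_survival(times, events):
    """Kaplan-Meier estimate of the censoring survival G(t-),
    evaluated just before each observation's own time."""
    order = np.argsort(times)
    t, d = times[order], events[order]
    n = len(t)
    at_risk = n - np.arange(n)
    # a censoring "event" is the complement of the failure indicator
    factors = 1.0 - (1 - d) / at_risk
    G = np.cumprod(factors)
    G_minus = np.concatenate([[1.0], G[:-1]])    # left-continuous version
    out = np.empty(n)
    out[order] = G_minus
    return out

rng = np.random.default_rng(5)
n = 300
x = rng.uniform(0, 2, n)
t_fail = 1.0 + 0.5 * x + 0.1 * rng.normal(size=n)   # true failure times
t_cens = rng.uniform(0.5, 4.0, n)                    # random right censoring
y = np.minimum(t_fail, t_cens)                       # observed time
delta = (t_fail <= t_cens).astype(float)             # 1 = uncensored

G = km_censoring_survival(y, delta)
w = delta / np.maximum(G, 1e-6)                      # IPCW weights

X = np.column_stack([np.ones(n), x])
beta = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * y))
```

Under independent censoring these weights make the weighted normal equations unbiased for the regression of the true failure time on x, so the fit recovers the underlying line despite the censored observations.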

Two-step LS-SVR for censored regression

  • Bae, Jong-Sig;Hwang, Chang-Ha;Shim, Joo-Yong
    • Journal of the Korean Data and Information Science Society / Vol. 23, No. 2 / pp.393-401 / 2012
  • This paper deals with the estimation of least squares support vector regression when the responses are subject to random right censoring. The estimation is performed in two steps: ordinary least squares support vector regression, followed by least squares support vector regression with the censored data. We use the empirical fact that the estimated regression functions under random right censoring are closer to the true regression functions than the observed failure times are. The hyperparameters of the model, which affect the performance of the proposed procedure, are selected by a generalized cross validation function. Experimental results are then presented which indicate the performance of the proposed procedure.

An Equivariant and Robust Estimator in Multivariate Regression Based on Least Trimmed Squares

  • Jung, Kang-Mo
    • Communications for Statistical Applications and Methods / Vol. 10, No. 3 / pp.1037-1046 / 2003
  • We propose an equivariant and robust estimator for the multivariate regression model based on the least trimmed squares (LTS) estimator in univariate regression. We call this estimator the multivariate least trimmed squares (MLTS) estimator. The MLTS estimator accounts for correlations among response variables, and it can be shown that the proposed estimator has the appropriate equivariance properties defined in multivariate regression. The MLTS estimator has a high breakdown point, as does the LTS estimator in the univariate case. We develop an algorithm for computing the MLTS estimate. Simulations are performed to compare the efficiency of the MLTS estimate with the coordinatewise LTS estimate, and a numerical example is given to illustrate the effectiveness of the MLTS estimate in multivariate regression.
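The univariate LTS estimator that MLTS generalises minimizes the sum of the h smallest squared residuals, and is commonly computed by concentration steps: fit, keep the h best-fitting cases, refit on those, repeat, over several random starts. A minimal univariate-response sketch follows (the multivariate MLTS algorithm itself is not implemented); the contamination scheme is an illustrative assumption.

```python
# Least trimmed squares via concentration ("C-") steps with random
# elemental starts, compared with OLS on data with gross outliers.
import numpy as np

def lts_fit(X, y, h, n_starts=20, n_csteps=10, seed=0):
    rng = np.random.default_rng(seed)
    n, p = X.shape
    best_beta, best_obj = None, np.inf
    for _ in range(n_starts):
        idx = rng.choice(n, size=p, replace=False)       # elemental start
        beta = np.linalg.lstsq(X[idx], y[idx], rcond=None)[0]
        for _ in range(n_csteps):
            r2 = (y - X @ beta) ** 2
            keep = np.argsort(r2)[:h]                    # h best-fitting cases
            beta = np.linalg.lstsq(X[keep], y[keep], rcond=None)[0]
        obj = np.sort((y - X @ beta) ** 2)[:h].sum()     # trimmed objective
        if obj < best_obj:
            best_obj, best_beta = obj, beta
    return best_beta

rng = np.random.default_rng(6)
n = 100
x = rng.uniform(0, 10, n)
y = 1.0 + 2.0 * x + 0.2 * rng.normal(size=n)
y[:20] += 15.0                                           # 20% gross outliers
X = np.column_stack([np.ones(n), x])

beta_lts = lts_fit(X, y, h=int(0.75 * n))                # trim 25% of cases
beta_ols = np.linalg.lstsq(X, y, rcond=None)[0]
```

With 20% of the responses shifted upward, the OLS intercept is pulled far from the truth while the LTS fit stays on the clean majority of the data, illustrating the high breakdown point the abstract refers to.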