Least-squares Regression

DETECTION OF OUTLIERS IN WEIGHTED LEAST SQUARES REGRESSION

  • Shon, Bang-Yong; Kim, Guk-Boh
    • Journal of applied mathematics & informatics, v.4 no.2, pp.501-512, 1997
  • In the multiple linear regression model we presuppose assumptions (independence, normality, variance homogeneity, and so on) on the error term. When case weights are given because of variance heterogeneity, the regression parameters can be estimated efficiently using the weighted least squares estimator. Unfortunately, this estimator is sensitive to outliers, like the ordinary least squares estimator. Thus, in this paper we propose some statistics for the detection of outliers in weighted least squares regression.
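
As a rough illustration of the setting (not the paper's proposed statistics), the sketch below fits a weighted least squares model and flags observations with large internally studentized residuals; the 2.5 cutoff and the simulated data are assumptions.

```python
# Minimal sketch: WLS fit plus a generic studentized-residual outlier check.
import numpy as np

def wls_outliers(X, y, w, cutoff=2.5):
    """Fit WLS and return indices with large studentized residuals."""
    X1 = np.column_stack([np.ones(len(y)), X])   # add intercept
    W = np.sqrt(w)
    Xw, yw = X1 * W[:, None], y * W              # whiten by sqrt(weights)
    beta, *_ = np.linalg.lstsq(Xw, yw, rcond=None)
    H = Xw @ np.linalg.pinv(Xw.T @ Xw) @ Xw.T    # hat matrix of whitened fit
    r = yw - Xw @ beta                           # whitened residuals
    s2 = r @ r / (len(y) - X1.shape[1])
    t = r / np.sqrt(s2 * (1 - np.diag(H)))       # internally studentized
    return beta, np.where(np.abs(t) > cutoff)[0]

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 2))
w = rng.uniform(0.5, 2.0, size=50)               # known case weights
y = 1 + X @ np.array([2.0, -1.0]) + rng.normal(scale=1 / np.sqrt(w))
y[3] += 8                                        # plant an outlier
beta, out = wls_outliers(X, y, w)
print(beta, out)                                 # index 3 should be flagged
```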

Asymmetric least squares regression estimation using weighted least squares support vector machine

  • Hwang, Chang-Ha
    • Journal of the Korean Data and Information Science Society, v.22 no.5, pp.999-1005, 2011
  • This paper proposes a weighted least squares support vector machine for asymmetric least squares regression. The method achieves nonlinear prediction power while making no assumptions about the underlying probability distributions. A cross validation function is introduced to choose optimal hyperparameters in the procedure. Experimental results are presented that indicate the performance of the proposed model.
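
The sketch below illustrates the general idea under stated assumptions: an LS-SVM dual system is solved repeatedly with asymmetric weights to obtain an expectile (asymmetric least squares) fit. The RBF kernel, the gamma and sigma values, and tau are illustrative choices, not the paper's cross-validated ones.

```python
# Expectile regression via an iteratively reweighted LS-SVM (a sketch).
import numpy as np

def rbf_kernel(A, B, sigma=1.0):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def ls_svm_solve(K, y, weights, gamma=10.0):
    """Solve the weighted LS-SVM dual system for (alpha, b)."""
    n = len(y)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:], A[1:, 0] = 1.0, 1.0
    A[1:, 1:] = K + np.diag(1.0 / (gamma * weights))
    sol = np.linalg.solve(A, np.concatenate([[0.0], y]))
    return sol[1:], sol[0]                       # alpha, b

def expectile_ls_svm(X, y, tau=0.7, iters=20):
    K = rbf_kernel(X, X)
    v = np.ones(len(y))                          # start symmetric
    for _ in range(iters):
        alpha, b = ls_svm_solve(K, y, v)
        resid = y - (K @ alpha + b)
        v = np.where(resid > 0, tau, 1 - tau)    # asymmetric reweighting
    return alpha, b

rng = np.random.default_rng(1)
X = np.sort(rng.uniform(-3, 3, size=(80, 1)), axis=0)
y = np.sin(X[:, 0]) + rng.normal(scale=0.3, size=80)
alpha, b = expectile_ls_svm(X, y, tau=0.7)       # fit the 0.7-expectile curve
```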

Asymmetric Least Squares Estimation for A Nonlinear Time Series Regression Model

  • Kim, Tae Soo; Kim, Hae Kyoung; Yoon, Jin Hee
    • Communications for Statistical Applications and Methods, v.8 no.3, pp.633-641, 2001
  • The least squares method is usually applied to estimate the parameters of regression models. However, the least squares estimator is not very efficient when the distribution of the errors is skewed. In this paper, we propose the asymmetric least squares estimator for a particular nonlinear time series regression model and give simple, practical sufficient conditions for the strong consistency of the estimators.
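
A minimal sketch of the asymmetric least squares criterion applied to a nonlinear model; the model form a*exp(b*x), the skewed gamma error law, and the use of scipy.optimize.minimize are illustrative assumptions, not the paper's time series specification.

```python
# Asymmetric least squares for a nonlinear mean function (a sketch).
import numpy as np
from scipy.optimize import minimize

def als_loss(theta, x, y, tau):
    a, b = theta
    r = y - a * np.exp(b * x)
    w = np.where(r >= 0, tau, 1 - tau)           # asymmetric squared loss
    return np.sum(w * r ** 2)

rng = np.random.default_rng(2)
x = rng.uniform(0, 1, 200)
e = rng.gamma(2.0, 1.0, 200) - 2.0               # skewed, mean-zero errors
y = 1.5 * np.exp(0.8 * x) + e
fit = minimize(als_loss, x0=[1.0, 0.5], args=(x, y, 0.5))
print(fit.x)                                     # tau = 0.5 recovers plain LS
```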

NONLINEAR ASYMMETRIC LEAST SQUARES ESTIMATORS

  • Park, Seung-Hoe; Kim, Hae-Kyung; Lee, Young
    • Journal of the Korean Statistical Society, v.32 no.1, pp.47-64, 2003
  • In this paper, we consider the asymptotic properties of asymmetric least squares estimators for nonlinear regression models. The paper provides sufficient conditions for strong consistency and asymptotic normality of the proposed estimators and derives the asymptotic relative efficiency of the proposed estimators with respect to the regression quantile estimators. We give some examples and the results of a Monte Carlo simulation comparing the asymmetric least squares estimators with the regression quantile estimators.
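
In the spirit of the paper's Monte Carlo comparison, the sketch below contrasts the sampling spread of a slope estimated under the asymmetric squared loss with that under the check (quantile) loss on skewed errors; the linear model, error law, optimizer, and tau are illustrative assumptions.

```python
# Tiny Monte Carlo: expectile-loss vs check-loss slope estimates (a sketch).
import numpy as np
from scipy.optimize import minimize

def fit_slope(x, y, loss, tau=0.5):
    def obj(theta):
        r = y - theta[0] - theta[1] * x
        if loss == "expectile":
            return np.sum(np.where(r >= 0, tau, 1 - tau) * r ** 2)
        return np.sum(np.where(r >= 0, tau, 1 - tau) * np.abs(r))
    return minimize(obj, x0=[0.0, 0.0], method="Nelder-Mead").x[1]

rng = np.random.default_rng(3)
slopes = {"expectile": [], "quantile": []}
for _ in range(200):
    x = rng.uniform(0, 1, 100)
    y = 2.0 * x + rng.gamma(2.0, 1.0, 100)       # skewed errors, true slope 2
    for m in slopes:
        slopes[m].append(fit_slope(x, y, m))
for m, s in slopes.items():
    print(m, np.std(s))                          # sampling spread of the slope
```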

Noisy label based discriminative least squares regression and its kernel extension for object identification

  • Liu, Zhonghua; Liu, Gang; Pu, Jiexin; Liu, Shigang
    • KSII Transactions on Internet and Information Systems (TIIS), v.11 no.5, pp.2523-2538, 2017
  • In most of the existing literature, the definition of the class label has the following characteristics. First, the class labels of samples from the same object have an absolutely fixed value. Second, the difference between the class labels of samples from different objects should be maximized. However, the appearance of a face varies greatly with illumination, pose, and expression, so the previous definition of the class label is not entirely reasonable. Inspired by the discriminative least squares regression (DLSR) algorithm, a noisy label based discriminative least squares regression (NLDLSR) algorithm is presented in this paper. In our algorithm, the difference between the class labels of samples from different objects is still maximized, while the class labels of different samples from the same object are allowed to differ slightly, consistent with the fact that different samples from the same object exhibit some variation. In addition, the proposed NLDLSR is extended to the kernel space, yielding a novel kernel noisy label based discriminative least squares regression (KNLDLSR) algorithm. A large number of experiments show that the proposed algorithms achieve very good performance.
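
A hedged sketch of the relaxed-label idea that DLSR-style methods build on (not the paper's exact NLDLSR formulation): a ridge regression step alternates with a nonnegative label-slack update that lets labels drift only in the margin-enlarging direction. All data and parameter values here are made up for illustration.

```python
# DLSR-style "epsilon-dragging" least squares classification (a sketch).
import numpy as np

def dlsr_sketch(X, Y, lam=0.1, iters=30):
    """X: (n, d) features; Y: (n, c) one-hot labels."""
    B = np.where(Y > 0, 1.0, -1.0)              # allowed drift directions
    M = np.zeros_like(Y)                        # nonnegative label slack
    d = X.shape[1]
    G = np.linalg.inv(X.T @ X + lam * np.eye(d)) @ X.T   # ridge operator
    for _ in range(iters):
        T = Y + B * M                           # relaxed targets
        W = G @ T                               # ridge regression step
        M = np.maximum(B * (X @ W - Y), 0.0)    # optimal slack, elementwise
    return W

rng = np.random.default_rng(4)
X = np.vstack([rng.normal(i, 1.0, (30, 5)) for i in range(3)])
Y = np.repeat(np.eye(3), 30, axis=0)
W = dlsr_sketch(X, Y)
pred = (X @ W).argmax(axis=1)                   # classify by largest score
print((pred == np.repeat(np.arange(3), 30)).mean())
```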

Unified Non-iterative Algorithm for Principal Component Regression, Partial Least Squares and Ordinary Least Squares

  • Kim, Jong-Duk
    • Journal of the Korean Data and Information Science Society, v.14 no.2, pp.355-366, 2003
  • A unified procedure for principal component regression (PCR), partial least squares (PLS) and ordinary least squares (OLS) is proposed. The process gives solutions for PCR, PLS and OLS in a unified and non-iterative way. This enables us to see the interrelationships among the three regression coefficient vectors; the so-called E-matrix in the solution expression plays the key role in differentiating the methods. In addition to setting out the procedure, the paper supplies a robust numerical algorithm for its implementation, which is used to show how the procedure performs on a real-world data set.
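
For orientation, the sketch below computes the three coefficient vectors by their conventional separate routes (lstsq for OLS, SVD truncation for PCR, a NIPALS-style loop for PLS1); the unified, non-iterative E-matrix formulation is the paper's contribution and is not reproduced here.

```python
# OLS, PCR, and PLS1 coefficients by their standard routes (a sketch).
import numpy as np

def ols(X, y):
    return np.linalg.lstsq(X, y, rcond=None)[0]

def pcr(X, y, k):
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return Vt[:k].T @ ((U[:, :k].T @ y) / s[:k])   # keep k components

def pls1(X, y, k):
    Xk, yk = X.copy(), y.copy()
    Wl, Pl, q = [], [], []
    for _ in range(k):
        w = Xk.T @ yk; w /= np.linalg.norm(w)      # weight vector
        t = Xk @ w; tt = t @ t
        p = Xk.T @ t / tt; c = (yk @ t) / tt
        Xk = Xk - np.outer(t, p); yk = yk - c * t  # deflate
        Wl.append(w); Pl.append(p); q.append(c)
    W, P, q = np.array(Wl).T, np.array(Pl).T, np.array(q)
    return W @ np.linalg.solve(P.T @ W, q)

rng = np.random.default_rng(5)
X = rng.normal(size=(40, 6)); X -= X.mean(0)       # center predictors
y = X @ rng.normal(size=6) + rng.normal(scale=0.1, size=40); y -= y.mean()
print(ols(X, y)); print(pcr(X, y, 3)); print(pls1(X, y, 3))
```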

Shrinkage Structure of Ridge Partial Least Squares Regression

  • Kim, Jong-Duk
    • Journal of the Korean Data and Information Science Society, v.18 no.2, pp.327-344, 2007
  • Ridge partial least squares regression (RPLS) is a regression method obtained by combining ridge regression and partial least squares regression; it is intended to provide better predictive ability and to be less sensitive to overfitting. In this paper, explicit expressions for the shrinkage factors of RPLS are developed. The structure of the shrinkage factors is explored and compared with those of other biased regression methods, such as ridge regression, principal component regression, ridge principal component regression, and partial least squares regression, using a near-infrared data set.
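
A small sketch of the shrinkage-factor view: for estimators whose fit can be written as y_hat = sum_i f_i (u_i'y) u_i in the SVD basis of X, the f_i are the shrinkage factors. Ridge and PCR factors are shown below as familiar reference points; the RPLS factors themselves are derived in the paper.

```python
# Shrinkage factors of ridge and PCR in the SVD basis of X (a sketch).
import numpy as np

def shrinkage_factors(X, method="ridge", lam=1.0, k=3):
    s = np.linalg.svd(X, compute_uv=False)       # singular values of X
    if method == "ridge":
        return s**2 / (s**2 + lam)               # smooth shrinkage
    if method == "pcr":
        f = np.zeros_like(s); f[:k] = 1.0        # keep-or-kill
        return f
    return np.ones_like(s)                       # OLS: no shrinkage

rng = np.random.default_rng(6)
X = rng.normal(size=(30, 5))
print(shrinkage_factors(X, "ridge", lam=2.0))
print(shrinkage_factors(X, "pcr", k=2))
```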

REGRESSION WITH CENSORED DATA BY LEAST SQUARES SUPPORT VECTOR MACHINE

  • Kim, Dae-Hak; Shim, Joo-Yong; Oh, Kwang-Sik
    • Journal of the Korean Statistical Society, v.33 no.1, pp.25-34, 2004
  • In this paper we propose a prediction method for the regression model with randomly censored observations in the training data set. Least squares support vector machine regression is applied to predict the regression function by incorporating weights, assessed for each observation, into the optimization problem. Numerical examples are given to show the performance of the proposed prediction method.
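
A minimal sketch of how per-observation weights enter the LS-SVM dual system; the way the weights are assessed here (a fixed down-weight for censored cases) is a placeholder assumption, not the paper's construction.

```python
# Weighted LS-SVM regression with censoring-based weights (a sketch).
import numpy as np

def weighted_ls_svm(K, y, w, gamma=10.0):
    n = len(y)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = A[1:, 0] = 1.0
    A[1:, 1:] = K + np.diag(1.0 / (gamma * w))   # weights scale regularization
    sol = np.linalg.solve(A, np.concatenate([[0.0], y]))
    return sol[1:], sol[0]                       # alpha, b

rng = np.random.default_rng(7)
x = np.sort(rng.uniform(0, 4, 60))
t = np.log1p(x) + rng.normal(scale=0.2, size=60)    # true failure times
c = rng.uniform(0.5, 3.0, 60)                       # censoring times
y, delta = np.minimum(t, c), (t <= c).astype(float) # observed time, status
K = np.exp(-np.subtract.outer(x, x) ** 2 / 2)       # RBF kernel (assumed)
w = np.where(delta == 1, 1.0, 0.1)                  # down-weight censored obs
alpha, b = weighted_ls_svm(K, y, w)
yhat = K @ alpha + b                                # predicted regression fn
```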

Two-step LS-SVR for censored regression

  • Bae, Jong-Sig; Hwang, Chang-Ha; Shim, Joo-Yong
    • Journal of the Korean Data and Information Science Society, v.23 no.2, pp.393-401, 2012
  • This paper deals with estimation of the least squares support vector regression when the responses are subject to random right censoring. The estimation is performed in two steps: ordinary least squares support vector regression, followed by least squares support vector regression with the censored data. We use the empirical fact that the estimated regression functions under random right censoring are closer to the true regression functions than the observed failure times are. The hyperparameters of the model, which affect the performance of the proposed procedure, are selected by a generalized cross validation function. Experimental results are presented that indicate the performance of the proposed procedure.
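
A sketch of the two-step idea under an assumed second step: an ordinary LS-SVR fit supplies synthetic responses for censored cases (whose observed times understate the failure times), and the model is refit. The max(fit, observed) imputation below stands in for the paper's exact construction.

```python
# Two-step LS-SVR for right-censored responses (a sketch).
import numpy as np

def ls_svr(K, y, gamma=10.0):
    n = len(y)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate([[0.0], y]))
    return sol[1:], sol[0]                          # alpha, b

rng = np.random.default_rng(8)
x = np.sort(rng.uniform(0, 4, 60))
t = np.log1p(x) + rng.normal(scale=0.2, size=60)    # true failure times
c = rng.uniform(0.5, 3.0, 60)                       # censoring times
y, delta = np.minimum(t, c), t <= c                 # observed time, status
K = np.exp(-np.subtract.outer(x, x) ** 2 / 2)
a, b = ls_svr(K, y)                                 # step 1: ordinary LS-SVR
y2 = np.where(delta, y, np.maximum(K @ a + b, y))   # lift censored responses
a2, b2 = ls_svr(K, y2)                              # step 2: refit
```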

An Equivariant and Robust Estimator in Multivariate Regression Based on Least Trimmed Squares

  • Jung, Kang-Mo
    • Communications for Statistical Applications and Methods, v.10 no.3, pp.1037-1046, 2003
  • We propose an equivariant and robust estimator in the multivariate regression model based on the least trimmed squares (LTS) estimator in univariate regression. We call this estimator the multivariate least trimmed squares (MLTS) estimator. The MLTS estimator accounts for correlations among response variables, and it can be shown that the proposed estimator has the appropriate equivariance properties defined in multivariate regression. The MLTS estimator has a high breakdown point, as does the LTS estimator in the univariate case. We develop an algorithm for computing the MLTS estimate. Simulations are performed to compare the efficiency of the MLTS estimate with the coordinatewise LTS estimate, and a numerical example is given to illustrate the effectiveness of the MLTS estimate in multivariate regression.
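
A hedged sketch of a concentration-step style algorithm consistent with the MLTS description: refit on the h cases with the smallest residual Mahalanobis distances until the subset stabilizes. Random restarts and the paper's exact algorithmic details are omitted.

```python
# Multivariate least trimmed squares via concentration steps (a sketch).
import numpy as np

def mlts(X, Y, h, iters=50, seed=0):
    rng = np.random.default_rng(seed)
    idx = rng.choice(len(Y), h, replace=False)      # random initial subset
    for _ in range(iters):
        B, *_ = np.linalg.lstsq(X[idx], Y[idx], rcond=None)
        R = Y - X @ B                               # residual matrix
        S = np.cov(R[idx], rowvar=False)            # subset residual scatter
        d = np.einsum('ij,jk,ik->i', R, np.linalg.inv(S), R)
        new = np.argsort(d)[:h]                     # h smallest distances
        if set(new) == set(idx):
            break                                   # subset stabilized
        idx = new
    return B, idx

rng = np.random.default_rng(9)
X = np.column_stack([np.ones(100), rng.normal(size=(100, 2))])
B_true = rng.normal(size=(3, 2))
Y = X @ B_true + rng.normal(scale=0.1, size=(100, 2))
Y[:10] += 10                                        # gross outliers
B, idx = mlts(X, Y, h=75)
print(np.round(B - B_true, 2))                      # close to zero
```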