• Title/Summary/Keyword: linear regression with constraints

Search Results: 22

Test for an Outlier in Multivariate Regression with Linear Constraints

  • Kim, Myung-Geun
    • Communications for Statistical Applications and Methods
    • /
    • v.9 no.2
    • /
    • pp.473-478
    • /
    • 2002
  • A test for a single outlier in multivariate regression with linear constraints on the regression coefficients is derived using a mean shift model. It is shown that influential observations based on case deletions in testing linear hypotheses are determined by two types of outliers: mean shift outliers with or without linear constraints. An illustrative example is given.
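In the unconstrained case, the mean shift model amounts to augmenting the design matrix with an indicator for the suspect case and t-testing its coefficient (this equals the externally studentized residual); the paper derives the analogue under linear constraints. A minimal sketch of the unconstrained test on simulated data (the data, the planted shift, and the tested case index are illustrative assumptions):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

# Simulated simple regression with a planted mean shift at case 4.
n = 30
x = rng.normal(size=n)
y = 0.5 + x + rng.normal(size=n)
y[4] += 8.0  # planted outlier
X = np.column_stack([np.ones(n), x])

# Mean shift model: add an indicator column for case i and t-test
# its coefficient (the unconstrained version of the paper's test).
i = 4
d = np.zeros((n, 1))
d[i] = 1.0
Z = np.hstack([X, d])
ZtZ_inv = np.linalg.inv(Z.T @ Z)
g = ZtZ_inv @ Z.T @ y
resid = y - Z @ g
s2 = resid @ resid / (n - Z.shape[1])
t = g[-1] / np.sqrt(s2 * ZtZ_inv[-1, -1])
p = 2 * stats.t.sf(abs(t), n - Z.shape[1])
print(t, p)
```

A large |t| (small p) flags case i as a mean shift outlier.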

Bayesian Variable Selection in Linear Regression Models with Inequality Constraints on the Coefficients (제한조건이 있는 선형회귀 모형에서의 베이지안 변수선택)

  • Oh, Man-Suk
    • The Korean Journal of Applied Statistics
    • /
    • v.15 no.1
    • /
    • pp.73-84
    • /
    • 2002
  • Linear regression models with inequality constraints on the coefficients are frequently used in economic models because of sign or order constraints on the coefficients. In this paper, we propose a Bayesian approach to selecting significant explanatory variables in linear regression models with inequality constraints on the coefficients. Bayesian variable selection requires computation of the posterior probability of each candidate model. We propose a method that computes all the necessary posterior model probabilities simultaneously: we obtain posterior samples from the most general model via the Gibbs sampling algorithm (Gelfand and Smith, 1990) and compute the posterior probabilities using these samples. A real example is given to illustrate the method.
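As a rough illustration of posterior inference under sign constraints, the sketch below draws from an unconstrained posterior and keeps only the sign-feasible draws. This simple rejection step stands in for the paper's Gibbs-sampling machinery (which is more efficient), and the data, the fixed-variance simplification, and all settings are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data with two positive effects (sign constraints beta >= 0, the
# kind of economic restriction mentioned in the abstract).
n = 200
X = rng.normal(size=(n, 2))
beta_true = np.array([1.0, 0.5])
y = X @ beta_true + rng.normal(size=n)

# Unconstrained Gaussian posterior for beta, with sigma^2 fixed at its
# OLS estimate for simplicity -- an assumption, not the paper's model.
XtX_inv = np.linalg.inv(X.T @ X)
beta_hat = XtX_inv @ X.T @ y
resid = y - X @ beta_hat
sigma2 = resid @ resid / (n - 2)

# Draw from the unconstrained posterior, then keep only draws satisfying
# the inequality constraints (rejection sampling; a Gibbs sampler over
# truncated conditionals scales better when constraints are tight).
draws = rng.multivariate_normal(beta_hat, sigma2 * XtX_inv, size=5000)
keep = draws[(draws >= 0).all(axis=1)]
print(keep.mean(axis=0))  # posterior mean under beta >= 0
```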

Testing General Linear Constraints on the Regression Coefficient Vector : A Note

  • Jeong, Ki-Jun
    • Journal of the Korean Statistical Society
    • /
    • v.8 no.2
    • /
    • pp.107-109
    • /
    • 1979
  • Consider a linear model with n observations and k explanatory variables: (1) $y = X\beta + u$, $u \sim N(0, \sigma^2 I_n)$. We assume that the model satisfies the ideal conditions. Consider the general linear constraints on the regression coefficient vector: (2) $R\beta = r$, where $R$ and $r$ are known matrices of orders $q \times k$ and $q \times 1$ respectively, and the rank of $R$ is $q$ ($q < k$).

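The hypothesis $R\beta = r$ above is tested with the classical F statistic $F = (R\hat\beta - r)'[R(X'X)^{-1}R']^{-1}(R\hat\beta - r) / (q\,s^2)$. A sketch on simulated data (the design, coefficients, and the particular constraint are illustrative assumptions, not from the paper):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Simulate y = X beta + u and test H0: R beta = r with the F statistic.
n, k = 100, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])
beta = np.array([2.0, 1.0, 1.0])
y = X @ beta + rng.normal(size=n)

R = np.array([[0.0, 1.0, -1.0]])  # H0: beta_1 = beta_2 (q = 1)
r = np.array([0.0])
q = R.shape[0]

XtX_inv = np.linalg.inv(X.T @ X)
b = XtX_inv @ X.T @ y
resid = y - X @ b
s2 = resid @ resid / (n - k)

diff = R @ b - r
F = float(diff @ np.linalg.solve(R @ XtX_inv @ R.T, diff)) / (q * s2)
p = float(stats.f.sf(F, q, n - k))
print(F, p)  # reject H0 for large F (small p)
```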

A comparison study of multiple linear quantile regression using non-crossing constraints (비교차 제약식을 이용한 다중 선형 분위수 회귀모형에 관한 비교연구)

  • Bang, Sungwan;Shin, Seung Jun
    • The Korean Journal of Applied Statistics
    • /
    • v.29 no.5
    • /
    • pp.773-786
    • /
    • 2016
  • Multiple quantile regression, which simultaneously estimates several conditional quantiles of the response given the covariates, provides comprehensive information about the relationship between the response and covariates. Separately estimated conditional quantiles can cross, which violates the definition of a quantile. To tackle this issue, multiple quantile regression with non-crossing constraints has been developed. In this paper, we carry out a comparison study of several popular methods for non-crossing multiple linear quantile regression to provide practical guidance on their application.
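One standard formulation (not necessarily any of the specific methods compared in the paper) fits all quantiles jointly as a single linear program over the pinball losses, with linear inequality constraints forcing the higher-quantile line to stay above the lower one at every observed x. A sketch for two quantiles and one covariate, on simulated heteroscedastic data:

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(2)

# Heteroscedastic toy data, where separately fitted quantile lines
# can cross; fit tau = 0.5 and tau = 0.9 jointly instead.
n = 30
x = rng.uniform(0, 1, n)
y = 1.0 + 2.0 * x + rng.normal(scale=0.3 + 0.5 * x, size=n)

taus = [0.5, 0.9]
m = len(taus)
nv = 4 * m + 2 * n * m  # (b0+, b0-, b1+, b1-) per tau, plus slacks u, v

c = np.zeros(nv)
A_eq = np.zeros((n * m, nv))
b_eq = np.tile(y, m)
idx = np.arange(n)
for j, tau in enumerate(taus):
    base, us = 4 * j, 4 * m + 2 * n * j
    vs = us + n
    c[us:us + n], c[vs:vs + n] = tau, 1 - tau  # pinball loss weights
    rows = n * j + idx
    # Fit equations: b0 + b1*x_i + u_i - v_i = y_i for quantile j.
    A_eq[rows, base + 0], A_eq[rows, base + 1] = 1.0, -1.0
    A_eq[rows, base + 2], A_eq[rows, base + 3] = x, -x
    A_eq[rows, us + idx] = 1.0
    A_eq[rows, vs + idx] = -1.0

# Non-crossing: tau = 0.9 line lies on or above the tau = 0.5 line at
# every observed x, written as A_ub @ z <= 0.
A_ub = np.zeros((n, nv))
A_ub[:, 0], A_ub[:, 1], A_ub[:, 2], A_ub[:, 3] = 1.0, -1.0, x, -x
A_ub[:, 4], A_ub[:, 5], A_ub[:, 6], A_ub[:, 7] = -1.0, 1.0, -x, x

res = linprog(c, A_ub=A_ub, b_ub=np.zeros(n), A_eq=A_eq, b_eq=b_eq,
              method="highs")
B = res.x[:4 * m].reshape(m, 4)
coef = np.column_stack([B[:, 0] - B[:, 1], B[:, 2] - B[:, 3]])
print(coef)  # rows: (intercept, slope) for tau = 0.5 and tau = 0.9
```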

LIKELIHOOD DISTANCE IN CONSTRAINED REGRESSION

  • Kim, Myung-Geun
    • Journal of applied mathematics & informatics
    • /
    • v.25 no.1_2
    • /
    • pp.489-493
    • /
    • 2007
  • Two diagnostic measures based on the likelihood distance for constrained regression with linear constraints on regression coefficients are derived. They are used for identifying influential observations in constrained regression. A numerical example is provided for illustration.
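The likelihood distance compares the full-data maximized log-likelihood with the log-likelihood evaluated at the case-deleted estimates, $LD_i = 2[\ell(\hat\theta) - \ell(\hat\theta_{(i)})]$. A sketch of the unconstrained version on simulated data (the paper derives the constrained analogue; the data and the planted outlier here are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulated regression with a planted outlier at case 0.
n = 40
x = rng.normal(size=n)
y = 1.0 + 2.0 * x + rng.normal(size=n)
y[0] += 8.0  # planted outlier
X = np.column_stack([np.ones(n), x])

def mle(Xs, ys):
    b = np.linalg.lstsq(Xs, ys, rcond=None)[0]
    s2 = np.mean((ys - Xs @ b) ** 2)  # ML variance estimate
    return b, s2

def loglik(b, s2):
    # Full-data Gaussian log-likelihood at parameters (b, s2).
    r = y - X @ b
    return -0.5 * n * np.log(2 * np.pi * s2) - 0.5 * (r @ r) / s2

b_full, s2_full = mle(X, y)
l_full = loglik(b_full, s2_full)

# LD_i = 2 * [logL(theta_hat) - logL(theta_hat_(i))], where theta_hat_(i)
# is refitted with case i deleted and evaluated on the full data.
mask = np.ones(n, dtype=bool)
LD = np.empty(n)
for i in range(n):
    mask[i] = False
    LD[i] = 2 * (l_full - loglik(*mle(X[mask], y[mask])))
    mask[i] = True

print(np.argmax(LD))  # the planted outlier should dominate
```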

Constrained $L_1$-Estimation in Linear Regression

  • Kim, Bu-Yong
    • Communications for Statistical Applications and Methods
    • /
    • v.5 no.3
    • /
    • pp.581-589
    • /
    • 1998
  • An algorithm is proposed for $L_1$-estimation with linear equality and inequality constraints in the linear regression model. The algorithm employs a linear scaling transformation to obtain the optimal solution of a linear programming type problem, and a special scheme is used to maintain the feasibility of the updated solution at each iteration. The convergence of the proposed algorithm is proved. In addition, updating and orthogonal decomposition techniques are employed to improve computational efficiency and numerical stability.

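The underlying linear-programming-type problem can be written out directly: split each absolute residual into nonnegative parts and minimize their sum subject to the linear constraints. A sketch using scipy's general-purpose HiGHS solver in place of the paper's specialized algorithm, with illustrative data and constraints:

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(4)

# Heavy-tailed toy data for L1 (least absolute deviations) regression.
n, k = 60, 2
X = rng.normal(size=(n, k))
y = X @ np.array([0.7, 0.3]) + 0.2 * rng.standard_t(df=2, size=n)

# Variables z = [b (k), u (n), v (n)] with u, v >= 0 and |r_i| = u_i + v_i.
c = np.concatenate([np.zeros(k), np.ones(2 * n)])
A_eq = np.hstack([X, np.eye(n), -np.eye(n)])  # X b + u - v = y
b_eq = y.copy()

# Linear equality constraint b0 + b1 = 1; inequality constraints b >= 0
# enter through the variable bounds.
A_eq = np.vstack([A_eq, np.concatenate([np.ones(k), np.zeros(2 * n)])])
b_eq = np.append(b_eq, 1.0)
bounds = [(0, None)] * (k + 2 * n)

res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=bounds, method="highs")
b_hat = res.x[:k]
print(b_hat)  # constrained L1 coefficient estimates
```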

Algorithm for the Constrained Chebyshev Estimation in Linear Regression

  • Kim, Bu-yong
    • Communications for Statistical Applications and Methods
    • /
    • v.7 no.1
    • /
    • pp.47-54
    • /
    • 2000
  • This article is concerned with an algorithm for Chebyshev estimation with or without linear equality and/or inequality constraints. The algorithm employs a linear scaling transformation scheme to reduce the computational burden that arises when the data set is quite large. The convergence of the proposed algorithm is proved. The updating and orthogonal decomposition techniques are also considered to improve computational efficiency and numerical stability.

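Chebyshev (minimax, $L_\infty$) estimation minimizes the largest absolute residual, which is itself a linear program: minimize $t$ subject to $-t \le y_i - x_i'b \le t$ plus any linear constraints on $b$. A sketch with scipy's HiGHS solver standing in for the paper's algorithm, on illustrative data with an added $b \ge 0$ constraint:

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(5)

# Toy data with bounded (uniform) noise, natural for minimax fitting.
n, k = 50, 2
X = rng.normal(size=(n, k))
y = X @ np.array([1.5, 0.5]) + rng.uniform(-0.3, 0.3, size=n)

# Variables z = [b (k), t (1)]; minimize t, the max absolute residual.
c = np.concatenate([np.zeros(k), [1.0]])
A_ub = np.vstack([
    np.hstack([X, -np.ones((n, 1))]),   #  X b - t <= y
    np.hstack([-X, -np.ones((n, 1))]),  # -X b - t <= -y
])
b_ub = np.concatenate([y, -y])
bounds = [(0, None)] * k + [(0, None)]  # inequality constraints b >= 0

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
b_hat, t_hat = res.x[:k], res.x[k]
print(b_hat, t_hat)  # t_hat is the minimized maximum absolute residual
```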

Bayesian inference for an ordered multiple linear regression with skew normal errors

  • Jeong, Jeongmun;Chung, Younshik
    • Communications for Statistical Applications and Methods
    • /
    • v.27 no.2
    • /
    • pp.189-199
    • /
    • 2020
  • This paper studies a Bayesian ordered multiple linear regression model with skew normal errors. The inherent information available in an applied regression problem often requires some constraints on the coefficients to be estimated. In addition, the assumption of normality of the errors is sometimes not appropriate for real data. To explain such situations more flexibly, we use the skew-normal distribution given by Sahu et al. (The Canadian Journal of Statistics, 31, 129-150, 2003) for the error terms, which includes the normal distribution as a special case. For the Bayesian methodology, the Markov chain Monte Carlo method is employed to resolve complicated integration problems. Under improper priors, the propriety of the associated posterior density is shown. The proposed Bayesian model is applied to NZAPB's apple data. For model comparison between the skew normal error model and the normal error model, we use the Bayes factor and the deviance information criterion given by Spiegelhalter et al. (Journal of the Royal Statistical Society Series B (Statistical Methodology), 64, 583-639, 2002). We also consider the problem of detecting an influential point concerning skewness using Bayes factors. Finally, concluding remarks are discussed.
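Given posterior draws, the deviance information criterion used for the model comparison above is computed as $DIC = \bar{D} + p_D$ with $p_D = \bar{D} - D(\bar\theta)$. A sketch for a plain normal-error regression, where the "posterior draws" are simulated around the OLS fit purely for illustration (an assumption; the paper's draws come from its skew-normal MCMC):

```python
import numpy as np

rng = np.random.default_rng(8)

# Toy regression data and a stand-in posterior for beta.
n = 50
x = rng.normal(size=n)
y = 1 + 2 * x + rng.normal(size=n)
X = np.column_stack([np.ones(n), x])

b = np.linalg.lstsq(X, y, rcond=None)[0]
s2 = np.mean((y - X @ b) ** 2)
cov = s2 * np.linalg.inv(X.T @ X)
draws = rng.multivariate_normal(b, cov, size=2000)  # stand-in MCMC draws

def deviance(beta):
    # -2 log likelihood with the variance plugged in (a simplification).
    r = y - X @ beta
    return n * np.log(2 * np.pi * s2) + (r @ r) / s2

D = np.array([deviance(bd) for bd in draws])
Dbar, Dhat = D.mean(), deviance(draws.mean(axis=0))
pD = Dbar - Dhat  # effective number of parameters
print(Dbar + pD)  # DIC; smaller values indicate a better model
```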

Patch based Semi-supervised Linear Regression for Face Recognition

  • Ding, Yuhua;Liu, Fan;Rui, Ting;Tang, Zhenmin
    • KSII Transactions on Internet and Information Systems (TIIS)
    • /
    • v.13 no.8
    • /
    • pp.3962-3980
    • /
    • 2019
  • To deal with single-sample face recognition, this paper presents a patch based semi-supervised linear regression (PSLR) algorithm, which draws facial variation information from unlabeled samples. Each facial image is divided into overlapped patches, and a regression model with a mapping matrix is constructed on each patch. These matrices are then adjusted by mapping unlabeled patches to $[1,1,\cdots,1]^T$. The solutions of all the mapping matrices are integrated into an overall objective function, which uses $\ell_{2,1}$-norm minimization constraints to improve the discrimination ability of the mapping matrices and reduce the impact of noise. After the mapping matrices are computed, a majority-voting strategy is adopted to classify the probe samples. To further learn the discrimination information between probe samples and obtain more robust mapping matrices, we also propose a multistage PSLR (MPSLR) algorithm, which iteratively updates the training dataset by adding reliably labeled probe samples to it. The effectiveness of our approaches is evaluated using three public facial databases. Experimental results show that our approaches are robust to illumination, expression and occlusion.
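A stripped-down sketch of the patch-plus-voting idea: per patch, regress the probe patch on each class's gallery patch and vote for the class with the smallest residual. This is a minimal single-gallery-image illustration on synthetic "images", not the paper's PSLR (which additionally learns mapping matrices from unlabeled samples under an $\ell_{2,1}$ penalty):

```python
import numpy as np

rng = np.random.default_rng(6)

# Synthetic gallery: one 8x8 "image" per class; the probe is a noisy
# copy of class 1's image (all settings are illustrative assumptions).
n_classes, img, patch = 3, 8, 4
gallery = rng.normal(size=(n_classes, img, img))
probe_label = 1
probe = gallery[probe_label] + rng.normal(scale=0.2, size=(img, img))

def patches(im):
    # Non-overlapping patches, flattened (the paper uses overlapped ones).
    return [im[r:r + patch, s:s + patch].ravel()
            for r in range(0, img, patch) for s in range(0, img, patch)]

votes = np.zeros(n_classes, dtype=int)
for pz, *pg in zip(patches(probe), *(patches(g) for g in gallery)):
    # Least-squares fit of the probe patch by each class's gallery patch;
    # the patch votes for the class with the smallest residual.
    dists = []
    for g in pg:
        coef = (g @ pz) / (g @ g)
        dists.append(np.linalg.norm(pz - coef * g))
    votes[int(np.argmin(dists))] += 1

print(np.argmax(votes))  # predicted class by majority vote
```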