• Title/Summary/Keyword: Generalized Least Squares Problem


Least Squares Approach for Structural Reanalysis

  • Kyung-Joon Cha;Ho-Jong Jang;Dal-Sun Yoon
    • Journal of the Korean Statistical Society
    • /
    • v.25 no.3
    • /
    • pp.369-379
    • /
    • 1996
  • A study is made of an approximate technique for structural reanalysis based on the force method. Perturbation analysis of the generalized least squares problem is adopted to reanalyze a damaged structure, and related results are presented.

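For reference, since every entry in this listing turns on it, the generalized least squares (GLS) problem can be stated in its standard textbook form (this formulation is generic and not taken from the cited paper): given a design matrix $X$, observations $y$, and a positive-definite covariance matrix $\Sigma$,

$$\hat{\beta}_{\mathrm{GLS}} = \arg\min_{\beta}\,(y - X\beta)^T \Sigma^{-1} (y - X\beta) = (X^T \Sigma^{-1} X)^{-1} X^T \Sigma^{-1} y,$$

which reduces to ordinary least squares when $\Sigma = \sigma^2 I$. A perturbation analysis of this problem studies how $\hat{\beta}_{\mathrm{GLS}}$ changes when $X$, $y$, or $\Sigma$ are perturbed, which is the role it plays in the reanalysis of a damaged structure above.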

A Method of Obtaining Least Squares Estimators of Estimable Functions in Classification Linear Models

  • Kim, Byung-Hwee;Chang, In-Hong;Dong, Kyung-Hwa
    • Journal of the Korean Statistical Society
    • /
    • v.28 no.2
    • /
    • pp.183-193
    • /
    • 1999
  • In the problem of estimating estimable functions in classification linear models, we propose a method of obtaining least squares estimators of estimable functions. This method is based on the hierarchical Bayesian approach for estimating a vector of unknown parameters. Also, we verify that estimators obtained by our method are identical to least squares estimators of estimable functions obtained by using either generalized inverses or full rank reparametrization of the models. Some examples are given which illustrate our results.

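The generalized-inverse route that the abstract compares against can be sketched in a few lines of numpy; the function name and the choice of the Moore-Penrose pseudoinverse are illustrative assumptions, and this is not the hierarchical Bayesian construction proposed in the paper:

```python
import numpy as np

def ls_estimable(X, y, lam):
    """Least squares estimate of an estimable function lam' beta in a
    possibly rank-deficient linear model y = X beta + e.  Any generalized
    inverse of X'X gives the same value of lam' beta_hat as long as
    lam' beta is estimable (lam' lies in the row space of X); the
    Moore-Penrose pseudoinverse is used here as one convenient choice."""
    beta_g = np.linalg.pinv(X.T @ X) @ X.T @ y   # one least squares solution
    return lam @ beta_g
```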

An Algorithm for One-Sided Generalized Least Squares Estimation and Its Application

  • Park, Chul-Gyu
    • Journal of the Korean Statistical Society
    • /
    • v.29 no.3
    • /
    • pp.361-373
    • /
    • 2000
  • A simple and efficient algorithm is introduced for generalized least squares estimation under nonnegativity constraints on the components of the parameter vector. This algorithm gives the exact solution to the estimation problem within a finite number of pivot operations. Besides an illustrative example, an empirical study is conducted to investigate the performance of the proposed algorithm. This study indicates that most problems are solved in a few iterations and that the number of iterations required to reach the optimal solution grows linearly with the size of the problem. Finally, we discuss extending the proposed algorithm to estimation problems with a more general set of linear inequality constraints.

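The same constrained estimation problem (though not the paper's pivoting algorithm) can be reproduced with standard tools by whitening the GLS criterion and passing it to an active-set nonnegative least squares solver; a minimal sketch, assuming a known positive-definite covariance matrix Sigma:

```python
import numpy as np
from scipy.optimize import nnls

def gls_nonneg(X, y, Sigma):
    """Generalized least squares under the constraint beta >= 0:
    minimize (y - X beta)' Sigma^{-1} (y - X beta) subject to beta >= 0,
    by whitening with a Cholesky factor of Sigma and calling NNLS."""
    L = np.linalg.cholesky(Sigma)
    Xw = np.linalg.solve(L, X)   # L^{-1} X
    yw = np.linalg.solve(L, y)   # L^{-1} y
    beta, _ = nnls(Xw, yw)
    return beta
```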

A Recursive Data Least Square Algorithm and Its Channel Equalization Application

  • Lim, Jun-Seok;Kim, Jae-Soo
    • The Journal of the Acoustical Society of Korea
    • /
    • v.25 no.2E
    • /
    • pp.43-48
    • /
    • 2006
  • Using the recursive generalized eigendecomposition method, we develop a recursive-form solution to the data least squares (DLS) problem, in which the error is assumed to lie in the data matrix only. Simulations demonstrate that DLS outperforms ordinary least squares for certain types of deconvolution problems.
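
A batch (non-recursive) version of the data least squares solution can be written with a single generalized eigendecomposition; the sketch below uses the standard criterion $\min_x \|Ax - b\|^2 / \|x\|^2$ and is offered as background, not as the recursive update derived in the paper:

```python
import numpy as np
from scipy.linalg import eig

def dls_solve(A, b):
    """Data least squares: all error is attributed to the data matrix A.
    Minimizes ||A x - b||^2 / ||x||^2 by forming the pencil
    M z = lambda D z with M = [A b]'[A b] and D = diag(I, 0) and taking
    the generalized eigenvector of the smallest finite eigenvalue."""
    n = A.shape[1]
    Ab = np.hstack([A, b.reshape(-1, 1)])
    M = Ab.T @ Ab
    D = np.eye(n + 1)
    D[n, n] = 0.0
    w, V = eig(M, D)                  # QZ factorization copes with singular D
    w = np.real(w)
    k = np.argmin(np.where(np.isfinite(w), w, np.inf))
    z = np.real(V[:, k])
    return -z[:n] / z[n]              # rescale so the last entry equals -1
```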

e-SVR using IRWLS Procedure

  • Shim, Joo-Yong
    • Journal of the Korean Data and Information Science Society
    • /
    • v.16 no.4
    • /
    • pp.1087-1094
    • /
    • 2005
  • e-insensitive support vector regression (e-SVR) is capable of providing a more complete description of the linear and nonlinear relationships among random variables. In this paper we propose an iteratively reweighted least squares (IRWLS) procedure to solve the quadratic problem of e-SVR with a modified loss function. Furthermore, we introduce the generalized approximate cross-validation function to select the hyperparameters which affect the performance of e-SVR. Experimental results are then presented which illustrate the performance of the IRWLS procedure for e-SVR.

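For orientation, the loss behind e-SVR and the generic reweighting it induces can be written out (this is the standard M-estimation view of IRWLS, not the modified loss actually used in the paper): the $\varepsilon$-insensitive loss is

$$L_\varepsilon(r) = \max(|r| - \varepsilon,\ 0),$$

and each IRWLS step replaces $\sum_i L_\varepsilon(r_i)$ by a weighted quadratic $\tfrac{1}{2}\sum_i w_i r_i^2$ with weights $w_i = \mathbb{1}\{|r_i| > \varepsilon\}/|r_i|$ evaluated at the current residuals, so that every iteration is an ordinary weighted least squares problem.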

EXTENSION OF FACTORING LIKELIHOOD APPROACH TO NON-MONOTONE MISSING DATA

  • Kim, Jae-Kwang
    • Journal of the Korean Statistical Society
    • /
    • v.33 no.4
    • /
    • pp.401-410
    • /
    • 2004
  • We address the problem of parameter estimation in multivariate distributions under ignorable non-monotone missing data. The factoring likelihood method for monotone missing data, a term coined by Rubin (1974), is extended to the more general case of non-monotone missing data. The proposed method is algebraically equivalent to the Newton-Raphson method applied to the observed likelihood, but avoids the burden of computing the first and second partial derivatives of the observed likelihood. Instead, the maximum likelihood estimates and their information matrices for each partition of the data set are computed separately and combined naturally using the generalized least squares method.
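
The combination step has a simple closed form in the special case where every partition of the data yields an estimate of the same parameter vector (shown here only for orientation; the paper treats the general case in which different partitions identify different functions of the parameters): given partition-wise MLEs $\hat{\theta}_k$ with information matrices $I_k$, the GLS combination is the precision-weighted average

$$\hat{\theta} = \Big(\sum_k I_k\Big)^{-1} \sum_k I_k \hat{\theta}_k,$$

with $\big(\sum_k I_k\big)^{-1}$ serving as its estimated covariance matrix.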

Generalized Bayes estimation for a SAR model with linear restrictions binding the coefficients

  • Chaturvedi, Anoop;Mishra, Sandeep
    • Communications for Statistical Applications and Methods
    • /
    • v.28 no.4
    • /
    • pp.315-327
    • /
    • 2021
  • Spatial Autoregressive (SAR) models have drawn considerable attention in the recent econometrics literature because of their capability to model spatial spillovers in a feasible way. When considering the Bayesian analysis of these models, one may face a lack of robustness with respect to the underlying prior assumptions. Generalized Bayes estimators provide a viable alternative for incorporating prior belief and are more robust with respect to the underlying prior assumptions. The present paper considers the SAR model with a set of linear restrictions binding the regression coefficients and derives a restricted generalized Bayes estimator for the coefficient vector. The minimaxity of the restricted generalized Bayes estimator has been established. Using a simulation study, it has been demonstrated that the estimator dominates the restricted least squares as well as the restricted Stein rule estimators.
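
The restricted least squares benchmark mentioned in the abstract has the familiar closed form for a generic linear model with exact restrictions $R\beta = r$ (stated here for reference; the paper works with the SAR likelihood and a generalized Bayes estimator): starting from the unrestricted estimator $\hat{\beta} = (X^T X)^{-1} X^T y$,

$$\hat{\beta}_R = \hat{\beta} - (X^T X)^{-1} R^T \big(R (X^T X)^{-1} R^T\big)^{-1} (R\hat{\beta} - r).$$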

Support vector expectile regression using IRWLS procedure

  • Choi, Kook-Lyeol;Shim, Jooyong;Seok, Kyungha
    • Journal of the Korean Data and Information Science Society
    • /
    • v.25 no.4
    • /
    • pp.931-939
    • /
    • 2014
  • In this paper we propose the iteratively reweighted least squares procedure to solve the quadratic programming problem of support vector expectile regression with an asymmetrically weighted squares loss function. The proposed procedure enables us to select the appropriate hyperparameters easily by using the generalized cross-validation function. Through numerical studies on artificial and real data sets, we show the effectiveness of the proposed method in terms of estimation performance.
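
The reweighting idea is easiest to see in the plain linear expectile (asymmetric least squares) setting; the sketch below is that simplified case, not the kernelized support vector expectile regression problem solved in the paper, and the function name and iteration count are illustrative:

```python
import numpy as np

def expectile_irwls(X, y, tau=0.5, n_iter=50):
    """Linear tau-expectile regression by iteratively reweighted least
    squares: minimize sum_i w(r_i) * r_i^2 with r_i = y_i - x_i' beta and
    asymmetric weights w(r) = tau if r >= 0 else 1 - tau."""
    beta = np.linalg.lstsq(X, y, rcond=None)[0]          # start from OLS
    for _ in range(n_iter):
        r = y - X @ beta
        w = np.where(r >= 0, tau, 1.0 - tau)
        sw = np.sqrt(w)
        beta = np.linalg.lstsq(sw[:, None] * X, sw * y, rcond=None)[0]
    return beta
```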

THE (R,S)-SYMMETRIC SOLUTIONS TO THE LEAST-SQUARES PROBLEM OF MATRIX EQUATION AXB = C

  • Liang, Mao-Lin;Dai, Li-Fang;Wang, San-Fu
    • Journal of applied mathematics & informatics
    • /
    • v.27 no.5_6
    • /
    • pp.1061-1071
    • /
    • 2009
  • For real generalized reflexive matrices R, S, i.e., $R^T = R$, $R^2 = I$, $S^T = S$, $S^2 = I$, we say that a real matrix X is (R,S)-symmetric if RXS = X. In this paper, an iterative algorithm is proposed to solve the least-squares problem of the matrix equation AXB = C with (R,S)-symmetric X. Furthermore, the optimal approximation to a given matrix $X_0$ is also derived by this iterative algorithm. Finally, a numerical example and its convergence curve show that the method is feasible and efficient.

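For small dense problems the constrained least-squares problem can also be set up directly, which makes the constraint explicit; this vectorized formulation is illustrative only and is not the iterative algorithm proposed in the paper:

```python
import numpy as np

def rs_symmetric_lstsq(A, B, C, R, S):
    """Least-squares solution of A X B = C over (R,S)-symmetric X,
    i.e. R X S = X with R^2 = S^2 = I.  Parametrizing X = (Y + R Y S)/2
    keeps X inside the constraint set, so the problem becomes an
    unconstrained least squares problem in vec(Y)."""
    m, n = A.shape[1], B.shape[0]
    # vec(A X B) = (B^T kron A) vec(X), with column-major vec
    K = 0.5 * (np.kron(B.T, A) + np.kron((S @ B).T, A @ R))
    yvec, *_ = np.linalg.lstsq(K, C.reshape(-1, order="F"), rcond=None)
    Y = yvec.reshape(m, n, order="F")
    return 0.5 * (Y + R @ Y @ S)
```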

ON DIFFERENTIABILITY OF THE MATRIX TRACE OPERATOR AND ITS APPLICATIONS

  • Dulov, E.V.;Andrianova, N.A.
    • Journal of applied mathematics & informatics
    • /
    • v.8 no.1
    • /
    • pp.97-109
    • /
    • 2001
  • This article is devoted to a “forgotten” and rarely used technique of matrix analysis, introduced in the 1960s-70s and enhanced by the authors. We study the matrix trace operator and its differentiability. This idea generalizes the notion of the scalar derivative to matrix computations. A list of the most common derivatives is given at the end of the article. Additionally, we point out a close connection between this technique and the least squares problem in its classical and generalized cases.
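
Two of the standard identities the article refers to, together with the least squares connection it mentions, can be illustrated as follows (using the convention $(\partial f/\partial X)_{ij} = \partial f/\partial X_{ij}$; these particular examples are generic, not reproduced from the paper):

$$\frac{\partial}{\partial X}\,\mathrm{tr}(AX) = A^T, \qquad \frac{\partial}{\partial X}\,\mathrm{tr}(X^T A X) = (A + A^T)\,X,$$

so that differentiating $f(\beta) = \mathrm{tr}\big[(y - X\beta)^T \Sigma^{-1} (y - X\beta)\big]$ and setting the result to zero recovers the generalized normal equations $X^T \Sigma^{-1} X \beta = X^T \Sigma^{-1} y$.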