• Title/Summary/Keyword: least-squares problem

Search Results: 347

On a Robust Subset Selection Procedure for the Slopes of Regression Equations

  • Song, Moon-Sup;Oh, Chang-Hyuck
    • Journal of the Korean Statistical Society
    • /
    • v.10
    • /
    • pp.105-121
    • /
    • 1981
  • The problem of selecting a subset containing the largest of several slope parameters of regression equations is considered. The proposed selection procedure is based on weighted median estimators for the regression parameters and the median of rescaled absolute residuals for the scale parameters. These estimators are compared with the classical least squares estimators in a simulation study. A Monte Carlo comparison is also made between the new procedure based on the weighted median estimators and the procedure based on the least squares estimators. The results show that the proposed procedure is quite robust with respect to the heaviness of the distribution tails. (A sketch of a weighted median slope estimator, under one common definition, follows this entry.)

  • PDF
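
The abstract does not define the weighted median estimator it uses; one common form takes the weighted median of all pairwise slopes, weighted by the corresponding x-spacings. The sketch below, written under that assumption (function names are illustrative, not from the paper), compares such an estimator with the ordinary least squares slope on heavy-tailed data.

```python
import numpy as np

def weighted_median(values, weights):
    """Smallest value whose cumulative weight reaches half of the total weight."""
    order = np.argsort(values)
    v, w = values[order], weights[order]
    cum = np.cumsum(w)
    return v[np.searchsorted(cum, 0.5 * cum[-1])]

def wm_slope(x, y):
    """Weighted median of pairwise slopes, weighted by |x_j - x_i| (one common definition)."""
    i, j = np.triu_indices(len(x), k=1)
    dx, dy = x[j] - x[i], y[j] - y[i]
    keep = dx != 0
    return weighted_median(dy[keep] / dx[keep], np.abs(dx[keep]))

def ls_slope(x, y):
    """Classical least squares slope, for comparison."""
    return np.cov(x, y, bias=True)[0, 1] / np.var(x)

rng = np.random.default_rng(0)
x = rng.uniform(0, 10, 50)
y = 2.0 * x + rng.standard_t(df=1, size=50)   # heavy-tailed (Cauchy) errors
print(wm_slope(x, y), ls_slope(x, y))         # the robust slope is typically much closer to 2
```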

Heat Transfer Analysis of Composite Materials Using MLS Finite Difference Method (MLS 유한차분법을 이용한 복합재료의 열전달문제 해석)

  • Yoon, Young-Cheol
    • Proceedings of the Computational Structural Engineering Institute Conference
    • /
    • 2008.04a
    • /
    • pp.2-7
    • /
    • 2008
  • A highly efficient moving least squares finite difference method (MLS FDM) is presented for heat transfer analysis of composite materials with interfaces. In the MLS FDM, the governing differential equations are discretized directly at each node, and no grid structure is required in the solution procedure. The discretization of the governing equations is carried out by Taylor expansion based on the moving least squares method, and a wedge function is designed to model the derivative jump across the interface. Numerical examples showed that the scheme achieves very good computational efficiency together with high accuracy, and it was successfully verified for heat transfer problems with different heat conductivities. (A simplified 1D sketch of the MLS derivative approximation follows this entry.)

  • PDF
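
The core of the MLS FDM described above is approximating derivatives at a node from scattered neighboring nodes by a weighted least squares fit of a Taylor expansion. The sketch below illustrates that idea in 1D with a quadratic Taylor basis and a Gaussian weight; the paper's wedge function and interface treatment are not reproduced, and the support parameter h and function names are illustrative.

```python
import numpy as np

def mls_derivative_weights(x_nodes, x0, h):
    """Taylor-expansion-based MLS: returns a matrix D such that D @ u approximates
    [u(x0), u'(x0), u''(x0)] from scattered nodal values u (1D sketch)."""
    dx = x_nodes - x0
    # Taylor basis: u(x_i) ~ u(x0) + dx*u'(x0) + (dx**2 / 2)*u''(x0)
    P = np.column_stack([np.ones_like(dx), dx, 0.5 * dx ** 2])
    W = np.diag(np.exp(-(dx / h) ** 2))            # Gaussian weight with support scale h
    # Weighted least squares: coefficients = (P^T W P)^{-1} P^T W u
    return np.linalg.solve(P.T @ W @ P, P.T @ W)

# Example: estimate the second derivative of sin(x) at x0 from irregular nodes
rng = np.random.default_rng(1)
x0 = 1.0
nodes = x0 + rng.uniform(-0.3, 0.3, 12)
D = mls_derivative_weights(nodes, x0, h=0.2)
u = np.sin(nodes)
print(D[2] @ u, -np.sin(x0))   # MLS estimate of u''(x0) vs the exact value
```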

On Line LS-SVM for Classification

  • Kim, Daehak;Oh, KwangSik;Shim, Jooyong
    • Communications for Statistical Applications and Methods
    • /
    • v.10 no.2
    • /
    • pp.595-601
    • /
    • 2003
  • In this paper we propose an online training method for classification based on the least squares support vector machine. The proposed method reduces the computational cost and allows the training to be performed incrementally. With an incremental formulation of the inverse matrix in the optimization problem, current information and new input data can be used to build the new inverse matrix for estimating the optimal bias and Lagrange multipliers, so the large-scale matrix inversion operation can be avoided. Numerical examples are included which indicate the performance of the proposed algorithm.
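
The computational device described above is updating the inverse of the growing LS-SVM system matrix instead of re-inverting it whenever new data arrive. Below is a minimal sketch of that incremental inverse update using the block (Schur complement) inversion formula on a generic symmetric positive definite matrix; the paper's specific bias and Lagrange multiplier formulation is not reproduced.

```python
import numpy as np

def grow_inverse(A_inv, b, d):
    """Given A^{-1}, return the inverse of the bordered matrix [[A, b], [b^T, d]]
    via the block (Schur complement) formula, avoiding a full re-inversion."""
    u = A_inv @ b
    s = d - b @ u                                  # scalar Schur complement
    top_left = A_inv + np.outer(u, u) / s
    top_right = -u[:, None] / s
    return np.block([[top_left, top_right],
                     [top_right.T, np.array([[1.0 / s]])]])

# Sanity check: grow the inverse of an SPD "kernel-like" matrix one column at a time
rng = np.random.default_rng(2)
M = rng.normal(size=(6, 6))
K = M @ M.T + 6 * np.eye(6)
A_inv = np.linalg.inv(K[:4, :4])
for n in range(4, 6):
    A_inv = grow_inverse(A_inv, K[:n, n], K[n, n])
print(np.allclose(A_inv, np.linalg.inv(K)))        # True
```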

Dimensional Analysis for the Front Chassis Module in the Auto Industry (자동차 프런트 샤시 모듈의 좌표 해석)

  • 이동목;양승한
    • Journal of the Korean Society for Precision Engineering
    • /
    • v.21 no.8
    • /
    • pp.50-56
    • /
    • 2004
  • The directional ability of an automobile directly affects the driver and must therefore be given the highest priority. Alignment factors such as camber, caster, and toe directly affect the directional ability of a vehicle, and these factors are determined by the pose of the interlinks in the assembly of the automobile front chassis module. Measuring the position of the center point of the ball joints in the front lower arm is very difficult, and a method to determine this position is suggested in this paper. Pose estimation for the front chassis module and a dimensional evaluation to find the rotational characteristics of the front lower arm were developed based on fundamental geometric techniques. To interpret the inspection data obtained for the front chassis module, a 3-D best fit method is needed; it determines the relationship between the nominal design coordinate system and the corresponding feature coordinate system. The least squares method based on singular value decomposition is used in this paper.
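
The 3-D best fit step described above, relating the nominal design coordinate system to the measured feature coordinate system, is commonly solved as an orthogonal Procrustes (Kabsch) problem via singular value decomposition. A minimal sketch of that least squares rigid-body fit follows; the data and function names are illustrative rather than taken from the paper.

```python
import numpy as np

def best_fit_transform(P, Q):
    """Least squares rigid transform (R, t) aligning point set P to Q via SVD
    (the classical orthogonal Procrustes / Kabsch solution)."""
    cP, cQ = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cP).T @ (Q - cQ)                      # 3x3 cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])   # guard against reflection
    R = Vt.T @ D @ U.T
    t = cQ - R @ cP
    return R, t

# Example: recover a known rotation and translation from noisy measured points
rng = np.random.default_rng(3)
P = rng.normal(size=(20, 3))                       # nominal (design) coordinates
theta = 0.3
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
Q = P @ R_true.T + np.array([1.0, -2.0, 0.5]) + 0.01 * rng.normal(size=P.shape)
R, t = best_fit_transform(P, Q)
print(np.allclose(R, R_true, atol=0.05), np.round(t, 2))
```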

Partitioning likelihood method in the analysis of non-monotone missing data

  • Kim Jae-Kwang
    • Proceedings of the Korean Statistical Society Conference
    • /
    • 2004.11a
    • /
    • pp.1-8
    • /
    • 2004
  • We address the problem of parameter estimation in multivariate distributions under ignorable non-monotone missing data. The factoring likelihood method for monotone missing data, termed by Rubin (1974), is extended to the more general case of non-monotone missing data. The proposed method is algebraically equivalent to the Newton-Raphson method for the observed likelihood, but it avoids the burden of computing the first and second partial derivatives of the observed likelihood. Instead, the maximum likelihood estimates and their information matrices for each partition of the data set are computed separately and combined naturally using the generalized least squares method. A numerical example is also presented to illustrate the method. (A schematic sketch of the GLS combination step follows this entry.)

  • PDF
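
As a rough illustration of the combination step described above, the sketch below merges per-partition estimates using their information matrices as generalized least squares weights, in the simplified case where every partition estimates the same full parameter vector; the paper's partitions of a non-monotone missing pattern are more general, so this is only schematic.

```python
import numpy as np

def gls_combine(estimates, infos):
    """Precision-weighted (GLS) combination of per-partition estimates:
    theta_hat = (sum_k I_k)^{-1} sum_k I_k theta_k,
    valid for the simplified case where each partition estimates the full parameter."""
    I_total = sum(infos)
    rhs = sum(I @ th for I, th in zip(infos, estimates))
    theta = np.linalg.solve(I_total, rhs)
    cov = np.linalg.inv(I_total)                   # covariance of the combined estimator
    return theta, cov

# Two partitions estimating the same 2-vector with different precision per coordinate
theta1, I1 = np.array([1.1, 2.0]), np.diag([10.0, 2.0])
theta2, I2 = np.array([0.9, 2.4]), np.diag([2.0, 8.0])
theta, cov = gls_combine([theta1, theta2], [I1, I2])
print(theta)   # each coordinate is pulled toward the more informative partition
```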

Face recognition by PLS

  • Baek, Jang-Sun
    • Proceedings of the Korean Statistical Society Conference
    • /
    • 2003.10a
    • /
    • pp.69-72
    • /
    • 2003
  • The paper considers partial least squares (PLS) as a new dimension reduction technique for the feature vector to overcome the small sample size problem in face recognition. Principal component analysis (PCA), a conventional dimension reduction method, selects the components with maximum variability irrespective of the class information, so PCA does not necessarily extract features that are important for discriminating between classes. PLS, on the other hand, constructs its components so that their correlation with the class variable is maximized; therefore PLS components are more predictive than PCA components in classification. The experimental results on the Manchester and ORL databases show that PLS is to be preferred over PCA when classification is the goal and dimension reduction is needed. (A small PLS-versus-PCA numerical illustration follows this entry.)

  • PDF
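
The contrast drawn above between variance-seeking PCA components and class-correlated PLS components can be seen in a small numerical example. The sketch below extracts PLS1-style components with a simple deflation loop and compares their loadings with PCA loadings on a toy data set in which the discriminative feature has low variance; the data and function names are illustrative, not the face databases used in the paper.

```python
import numpy as np

def pls_components(X, y, n_comp=2):
    """Tiny PLS1-style extraction: each weight vector maximizes covariance with y,
    and X is deflated after each component (illustrative, not production code)."""
    X = X - X.mean(axis=0)
    y = y - y.mean()
    W = []
    for _ in range(n_comp):
        w = X.T @ y
        w /= np.linalg.norm(w)
        t = X @ w                                  # score vector
        p = X.T @ t / (t @ t)                      # loading vector
        X = X - np.outer(t, p)                     # deflate X
        W.append(w)
    return np.column_stack(W)

def pca_components(X, n_comp=2):
    """Top principal directions (maximum variance, ignoring the class labels)."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Vt[:n_comp].T

# Toy data: the class-relevant feature (column 0) has low variance
rng = np.random.default_rng(4)
n = 1000
y = rng.integers(0, 2, n).astype(float)
X = 5.0 * rng.normal(size=(n, 30))                 # high-variance, class-irrelevant features
X[:, 0] = y + 0.3 * rng.normal(size=n)             # low-variance but discriminative feature
W_pls, W_pca = pls_components(X, y), pca_components(X)
print(abs(W_pls[0, 0]), abs(W_pca[0, 0]))          # PLS loads on feature 0 far more than PCA
```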

GACV for partially linear support vector regression

  • Shim, Jooyong;Seok, Kyungha
    • Journal of the Korean Data and Information Science Society
    • /
    • v.24 no.2
    • /
    • pp.391-399
    • /
    • 2013
  • Partially linear regression is capable of providing a more complete description of the linear and nonlinear relationships among random variables. In support vector regression (SVR), the hyperparameters are known to affect the performance of the regression. In this paper we propose an iterative reweighted least squares (IRWLS) procedure to solve the quadratic problem of partially linear support vector regression with a modified loss function, which enables us to use the generalized approximate cross validation function to select the hyperparameters. Experimental results are then presented which illustrate the performance of partially linear SVR using the IRWLS procedure.
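
The IRWLS idea referred to above is to solve a sequence of weighted least squares problems whose weights are recomputed from the current residuals. The sketch below shows that loop on a plain linear model with a Huber-type weight function; the paper's modified SVR loss, kernel, and partially linear structure are not reproduced, so the loss choice and function names here are assumptions for illustration.

```python
import numpy as np

def irwls(X, y, loss_weight, n_iter=50, ridge=1e-6):
    """Generic IRWLS: repeatedly solve a weighted least squares problem whose
    weights are derived from the current residuals."""
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        r = y - X @ beta
        w = loss_weight(r)
        WX = X * w[:, None]
        beta = np.linalg.solve(X.T @ WX + ridge * np.eye(X.shape[1]), WX.T @ y)
    return beta

def huber_weight(r, delta=1.0):
    """w(r) = psi(r) / r for the Huber loss: 1 inside delta, delta/|r| outside."""
    a = np.abs(r)
    return np.where(a <= delta, 1.0, delta / np.maximum(a, 1e-12))

rng = np.random.default_rng(5)
X = np.column_stack([np.ones(100), rng.normal(size=100)])
y = 1.0 + 2.0 * X[:, 1] + rng.normal(size=100)
y[:5] += 20.0                                      # a few gross outliers
print(irwls(X, y, huber_weight))                   # close to [1, 2] despite the outliers
print(np.linalg.lstsq(X, y, rcond=None)[0])        # ordinary LS is pulled by the outliers
```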

Linear Prediction Approach for Accurate Dual-Channel Sine-Wave Parameter Estimation in White Gaussian Noise

  • So, Hing-Cheung;Zhou, Zhenhua
    • ETRI Journal
    • /
    • v.34 no.4
    • /
    • pp.641-644
    • /
    • 2012
  • The problem of sinusoidal parameter estimation at two channels with a common frequency in white Gaussian noise is addressed. By making use of the linear prediction property, an iterative linear least squares (LLS) algorithm for accurate frequency estimation is devised. The remaining parameters are then determined by an LLS fit using the frequency estimate. It is proven that the variance of the frequency estimate achieves the Cramér-Rao lower bound under sufficiently small noise conditions.
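
A noiseless sinusoid satisfies the linear prediction relation s[n+1] + s[n-1] = 2*cos(w)*s[n], which is the property the algorithm above exploits. The sketch below forms the basic (non-iterative) linear least squares estimate of 2*cos(w) pooled over both channels and then recovers amplitude and phase by a second LLS fit; the paper's iterative refinement and weighting are omitted, and all function names are illustrative.

```python
import numpy as np

def lp_frequency(signals):
    """Frequency estimate from the linear prediction property of a sinusoid,
    s[n+1] + s[n-1] = 2*cos(w)*s[n], fit by linear least squares across channels."""
    num = den = 0.0
    for s in signals:
        num += np.sum(s[1:-1] * (s[2:] + s[:-2]))
        den += np.sum(s[1:-1] ** 2)
    a = num / den                                  # LLS estimate of 2*cos(w)
    return np.arccos(np.clip(a / 2.0, -1.0, 1.0))

def fit_amp_phase(s, w):
    """Given the frequency, amplitude and phase follow from a linear LS fit
    s[n] ~ alpha*cos(w*n) + beta*sin(w*n)."""
    n = np.arange(len(s))
    A = np.column_stack([np.cos(w * n), np.sin(w * n)])
    alpha, beta = np.linalg.lstsq(A, s, rcond=None)[0]
    return np.hypot(alpha, beta), np.arctan2(-beta, alpha)

rng = np.random.default_rng(6)
n = np.arange(200)
w_true = 0.7
s1 = 1.5 * np.cos(w_true * n + 0.3) + 0.05 * rng.normal(size=n.size)
s2 = 0.8 * np.cos(w_true * n - 1.0) + 0.05 * rng.normal(size=n.size)
w_hat = lp_frequency([s1, s2])
print(w_hat, fit_amp_phase(s1, w_hat))             # near 0.7 and near (1.5, 0.3)
```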

Identification of continuous systems in the presence of input-output measurement noises

  • Yang, Zi-Jiang;Sagara, Setsuo;Wada, Kiyoshi
    • Proceedings of the Institute of Control, Robotics and Systems Conference
    • /
    • 1990.10b
    • /
    • pp.1222-1227
    • /
    • 1990
  • The problem of identifying continuous systems is considered when both the discrete input and output measurements are contaminated by white noise. Using a predesigned digital low-pass filter, a discrete-time estimation model is constructed easily, without directly approximating the system signal derivatives from sampled data. If the pass-band of the filter is designed to include the main frequency range of both the system input and output signals, the noise effects are sufficiently reduced and accurate estimates can be obtained by the least squares (LS) algorithm in the presence of low measurement noise. Two classes of filters are employed: infinite impulse response (IIR) filters and finite impulse response (FIR) filters. The former requires less computation and memory than the latter, while the latter is suitable for the bias-compensated least squares (BCLS) method, which compensates the bias of the LS estimate using estimates of the input-output noise variances and thus yields unbiased estimates in the presence of high noise. (A toy illustration of the bias-compensation idea follows this entry.)

  • PDF
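
The bias-compensation idea mentioned at the end of the abstract can be illustrated on a much simpler discrete-time model: least squares applied to noise-corrupted observations of an autoregression is biased toward zero, and subtracting the measurement noise contribution from the normal equations removes the bias. The sketch below assumes the noise variance is known, whereas the paper estimates the input-output noise variances; it is a toy analogue, not the paper's continuous-time filter-based estimator.

```python
import numpy as np

rng = np.random.default_rng(7)
a_true, sig_e, sig_v, N = 0.9, 1.0, 1.0, 20000

# Latent AR(1) process observed with additive white measurement noise
x = np.zeros(N)
for k in range(1, N):
    x[k] = a_true * x[k - 1] + sig_e * rng.normal()
y = x + sig_v * rng.normal(size=N)

num = np.sum(y[1:] * y[:-1])
den = np.sum(y[:-1] ** 2)
a_ls = num / den                                   # ordinary LS: biased toward zero
a_bcls = num / (den - (N - 1) * sig_v ** 2)        # bias-compensated LS (noise variance known)
print(a_ls, a_bcls)                                # a_ls is well below 0.9, a_bcls is close to 0.9
```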

Effect of Dimension Reduction on Prediction Performance of Multivariate Nonlinear Time Series

  • Jeong, Jun-Yong;Kim, Jun-Seong;Jun, Chi-Hyuck
    • Industrial Engineering and Management Systems
    • /
    • v.14 no.3
    • /
    • pp.312-317
    • /
    • 2015
  • The dynamic system approach to time series has been used in many real problems. Based on Takens' embedding theorem, we can build a predictive function whose input is the time delay coordinate vector, consisting of lagged values of the observed series, and whose output is the future values of the observed series. Although the time delay coordinate vector from a multivariate time series carries more information than the one from a univariate time series, it can exhibit statistical redundancy, which degrades the performance of the prediction function. We apply dimension reduction techniques to address this problem and analyze the effect of this approach on prediction. Our experiment uses the delayed Lorenz series, and least squares support vector regression approximates the predictive function. The results show that linearly preserving projection improves the prediction performance.
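
As a rough sketch of the pipeline described above, the code below builds time delay coordinate vectors from a toy multivariate series, reduces them with PCA, and fits a ridge (regularized least squares) predictor as a stand-in for LS-SVR; the delayed Lorenz data, the specific dimension reduction methods compared in the paper, and LS-SVR itself are not reproduced, so every name and parameter here is illustrative.

```python
import numpy as np

def delay_embed(series, lags):
    """Time delay coordinates: each row stacks the last `lags` values of every
    variable; the target is the next value of the first variable."""
    T, _ = series.shape
    X = np.array([series[t - lags + 1:t + 1].ravel() for t in range(lags - 1, T - 1)])
    y = series[lags:, 0]
    return X, y

def pca_reduce(X, k):
    """Project the delay vectors onto the top-k principal directions."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T

# Toy multivariate series (two noisy oscillators stand in for the delayed Lorenz data)
rng = np.random.default_rng(8)
t = 0.05 * np.arange(2000)
series = np.column_stack([np.sin(t), np.cos(1.3 * t)]) + 0.01 * rng.normal(size=(2000, 2))

X, y = delay_embed(series, lags=8)
Z = pca_reduce(X, k=4)                             # reduced time delay coordinates
lam = 1e-3                                         # ridge penalty (regularized least squares)
beta = np.linalg.solve(Z.T @ Z + lam * np.eye(Z.shape[1]), Z.T @ y)
print(Z.shape, np.mean((Z @ beta - y) ** 2))       # one-step-ahead in-sample error
```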