• Title/Abstract/Keyword: Least-squares Regression

570 search results

Test of the Hypothesis based on Nonlinear Regression Quantiles Estimators

  • Choi, Seung-Hoe
    • Journal of the Korean Data and Information Science Society
    • /
    • Vol. 14, No. 2
    • /
    • pp.153-165
    • /
    • 2003
  • This paper considers a likelihood ratio test statistic based on nonlinear regression quantiles estimators in order to test hypotheses about the regression parameter $\theta_0$, and derives the asymptotic distribution of the proposed test statistic under the null hypothesis and under a sequence of local alternative hypotheses. The paper also investigates the asymptotic relative efficiency of the proposed test relative to tests based on the least squares or least absolute deviation estimators, and gives some examples to illustrate the application of the main result.

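As background for the entry above: regression quantiles are defined through the check (pinball) loss. A minimal numpy sketch (synthetic data, not the paper's test statistic) showing that minimizing the check loss at tau = 0.5 over constants recovers the sample median:

```python
import numpy as np

def check_loss(r, tau):
    """Check (pinball) loss rho_tau(r) = r * (tau - 1[r < 0])."""
    return r * (tau - (r < 0))

rng = np.random.default_rng(0)
y = rng.normal(size=201)

# For tau = 0.5, the check-loss minimizer over constants is the sample median.
grid = np.sort(y)
objective = np.array([check_loss(y - c, 0.5).sum() for c in grid])
best = grid[int(np.argmin(objective))]
```

Replacing the constant with a nonlinear regression function g(x; theta) gives the nonlinear regression quantiles estimators the paper tests.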

A New Deletion Criterion of Principal Components Regression with Orientations of the Parameters

  • Lee, Won-Woo
    • Journal of the Korean Statistical Society
    • /
    • Vol. 16, No. 2
    • /
    • pp.55-70
    • /
    • 1987
  • Principal components regression is one of the substitutes for the least squares method when multicollinearity exists in the multiple linear regression model. It is observed graphically that the performance of principal components regression depends strongly on the values of the parameters. Accordingly, a new deletion criterion is developed that determines which principal components should be deleted from the analysis, and its usefulness is checked by simulations.

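A sketch of plain principal components regression, the baseline that the entry's deletion criterion refines (the criterion itself is not reproduced here); the data and the choice k = 3 are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 100, 4
X = rng.normal(size=(n, p))
X[:, 3] = X[:, 0] + 0.01 * rng.normal(size=n)  # near-collinear column
y = X @ np.array([1.0, -2.0, 0.5, 0.0]) + 0.1 * rng.normal(size=n)

# Regress y on the leading k principal components of the centered X.
Xc = X - X.mean(axis=0)
yc = y - y.mean()
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
k = 3                        # delete the smallest (near-null) component
Z = Xc @ Vt[:k].T            # scores on the retained components
gamma = np.linalg.lstsq(Z, yc, rcond=None)[0]
beta_pcr = Vt[:k].T @ gamma  # coefficients in the original coordinates
```

The usual rule deletes the components with the smallest eigenvalues; the paper's point is that which components can safely be deleted also depends on the orientation of the parameter vector.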

Determination of Design Width for Medium Streams in the Han River Basin

  • 전세진;안태진;박정응
    • Journal of Korea Water Resources Association
    • /
    • Vol. 31, No. 6
    • /
    • pp.675-684
    • /
    • 1998
  • To determine design-width formulas for medium streams in the Han River basin, this study collected the design flood discharge, drainage area, bed slope, and existing channel width for 216 stream reaches, and derived empirical design-width formulas using 1) least squares (LS), 2) least median of squares (LMS), and 3) reweighted least squares (RLS). Six functional forms of the design-width formula were considered for comparison with the existing width formula for the Han River basin. To evaluate the existing formula and the six candidate forms, the root mean square error, mean absolute error, and mean error were computed and compared; the results show that the form expressing width in terms of design flood discharge and bed slope is the most suitable. The design-width formulas estimated in this study are expected to serve as a guide for determining design widths of medium streams in the Han River basin.

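The LS and LMS fits used in the entry above can be sketched in numpy for a power-law width formula B = a·Q^b; the discharges, widths, and outliers below are synthetic, and RLS would simply refit least squares on the points the LMS fit does not flag as outliers:

```python
import numpy as np

rng = np.random.default_rng(2)
Q = rng.uniform(50, 2000, size=60)                    # design flood (synthetic)
B = 1.8 * Q**0.5 * rng.lognormal(0.0, 0.05, size=60)  # channel width (synthetic)
B[:3] *= 3.0                                          # a few gross outliers

# Least squares fit of log B = log a + b log Q.
x, y = np.log(Q), np.log(B)
A = np.column_stack([np.ones_like(x), x])
ls = np.linalg.lstsq(A, y, rcond=None)[0]

# Least median of squares: among exact fits through point pairs, keep the
# one whose squared residuals have the smallest median.
best, best_med = None, np.inf
for i in range(len(x)):
    for j in range(i + 1, len(x)):
        b1 = (y[j] - y[i]) / (x[j] - x[i])
        b0 = y[i] - b1 * x[i]
        med = np.median((y - b0 - b1 * x) ** 2)
        if med < best_med:
            best_med, best = med, (b0, b1)
lms = np.array(best)  # (intercept log a, exponent b)
```

Unlike the LS fit, the LMS fit is essentially unaffected by the three inflated widths, which is why the study pairs it with RLS for the final formula.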

Pitfalls in the Application of the COTE in a Linear Regression Model with Seasonal Data

  • Seuck Heun Song;YouSung Park
    • Communications for Statistical Applications and Methods
    • /
    • Vol. 4, No. 2
    • /
    • pp.353-358
    • /
    • 1997
  • When the disturbances in the linear regression model are generated by a seasonal autoregressive scheme, the Cochrane-Orcutt transformation estimator (COTE) is a well-known alternative to the generalized least squares estimator (GLSE). This paper analyzes the situations in which, under positive autocorrelation, the ordinary least squares estimator (OLSE) is always better than the COTE in terms of efficiency, defined here as the ratio of the total variances.

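A sketch of the Cochrane-Orcutt transformation for AR(1) disturbances; the paper's seasonal setting replaces the lag-1 quasi-difference with a seasonal lag, and all data below are synthetic:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 300
x = rng.normal(size=n)
u = np.zeros(n)
for t in range(1, n):                      # AR(1) disturbances, rho = 0.8
    u[t] = 0.8 * u[t - 1] + rng.normal(scale=0.5)
y = 2.0 + 1.5 * x + u

# Step 1: OLS, then estimate rho from the residuals.
X = np.column_stack([np.ones(n), x])
beta_ols = np.linalg.lstsq(X, y, rcond=None)[0]
r = y - X @ beta_ols
rho_hat = (r[1:] @ r[:-1]) / (r[:-1] @ r[:-1])

# Step 2: quasi-difference and refit (the first observation is dropped).
y_t = y[1:] - rho_hat * y[:-1]
X_t = X[1:] - rho_hat * X[:-1]
beta_co = np.linalg.lstsq(X_t, y_t, rcond=None)[0]
```

The transformed disturbances are approximately uncorrelated, so the second-stage least squares fit approximates GLS; the paper's caution is that with seasonal data this need not beat plain OLS.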

A Robust Estimation Procedure for the Linear Regression Model

  • Kim, Bu-Yong
    • Journal of the Korean Statistical Society
    • /
    • Vol. 16, No. 2
    • /
    • pp.80-91
    • /
    • 1987
  • Minimum $L_1$ norm estimation is a robust procedure in the sense that it leads to an estimator with greater statistical efficiency than the least squares estimator in the presence of outliers, and the $L_1$ norm estimator has some desirable statistical properties. In this paper a new computational procedure for $L_1$ norm estimation is proposed which combines the idea of the reweighted least squares method with the linear programming approach. A modification of the projective transformation method is employed to solve the linear programming problem instead of the simplex method. It is proved that the proposed algorithm terminates in a finite number of iterations.

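The reweighted-least-squares idea behind $L_1$ (least absolute deviations) estimation can be sketched as follows; this is plain IRWLS with weights 1/|residual|, not the paper's projective-transformation linear programming algorithm, and the data are synthetic:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 200
x = rng.uniform(0, 10, n)
y = 1.0 + 2.0 * x + rng.standard_t(df=2, size=n)  # heavy-tailed errors
y[:10] += 30.0                                    # gross outliers

X = np.column_stack([np.ones(n), x])

# IRWLS for L1: minimizing sum |r_i| is approximated by repeatedly solving
# weighted least squares with weights w_i = 1 / |r_i|.
beta = np.linalg.lstsq(X, y, rcond=None)[0]
for _ in range(50):
    w = 1.0 / np.maximum(np.abs(y - X @ beta), 1e-6)
    WX = X * w[:, None]
    beta = np.linalg.solve(X.T @ WX, WX.T @ y)
```

The 1e-6 floor keeps the weights bounded as residuals approach zero; the outliers get tiny weights and barely influence the fit.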

On a Robust Subset Selection Procedure for the Slopes of Regression Equations

  • Song, Moon-Sup;Oh, Chang-Hyuck
    • Journal of the Korean Statistical Society
    • /
    • Vol. 10
    • /
    • pp.105-121
    • /
    • 1981
  • The problem of selecting a subset containing the largest of several slope parameters of regression equations is considered. The proposed selection procedure is based on weighted median estimators for the regression parameters and on the median of rescaled absolute residuals for the scale parameters. These estimators are compared with the classical least squares estimators by a simulation study. A Monte Carlo comparison is also made between the new procedure based on the weighted median estimators and the procedure based on the least squares estimators. The results show that the proposed procedure is quite robust with respect to the heaviness of distribution tails.

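A sketch of a weighted median slope estimator of the kind the entry builds on: the weighted median of pairwise slopes, here with weights proportional to the x-spacings (an illustrative choice; the paper's exact weighting is not reproduced). Data are synthetic:

```python
import numpy as np

def weighted_median(values, weights):
    """Lower weighted median: smallest v with cumulative weight >= half."""
    order = np.argsort(values)
    v, w = values[order], weights[order]
    return v[np.searchsorted(np.cumsum(w), 0.5 * w.sum())]

rng = np.random.default_rng(5)
n = 80
x = rng.uniform(0, 5, n)
y = 3.0 + 1.2 * x + rng.normal(scale=0.3, size=n)
y[:5] += 10.0  # contaminate a few responses

# Pairwise slopes, weighted by |x_j - x_i|.
i, j = np.triu_indices(n, k=1)
dx = x[j] - x[i]
keep = np.abs(dx) > 1e-8
slopes = (y[j] - y[i])[keep] / dx[keep]
slope_wm = weighted_median(slopes, np.abs(dx[keep]))
```

Pairs involving a contaminated response produce wild slopes, but the weighted median ignores them, which is the robustness property the simulation study in the entry examines.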

Deep LS-SVM for regression

  • Hwang, Changha;Shim, Jooyong
    • Journal of the Korean Data and Information Science Society
    • /
    • Vol. 27, No. 3
    • /
    • pp.827-833
    • /
    • 2016
  • In this paper, we propose a deep least squares support vector machine (LS-SVM) for regression problems, which consists of an input layer and a hidden layer. In the hidden layer, LS-SVMs are trained with the original input variables and perturbed responses. For the final output, the main LS-SVM is trained with the outputs from the hidden-layer LS-SVMs as input variables and the original responses. In contrast to the multilayer neural network (MNN), the LS-SVMs in the deep LS-SVM are trained to minimize a penalized objective function, so the learning dynamics of the deep LS-SVM are entirely different from those of the MNN, in which all weights and biases are trained to minimize one final error function. Compared to MNN approaches, the deep LS-SVM does not make use of any combination weights, but trains all LS-SVMs in the architecture. Experimental results from real datasets illustrate that the deep LS-SVM significantly outperforms state-of-the-art machine learning methods on regression problems.
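The building block of the architecture, a single LS-SVM regressor, reduces to one linear system; a minimal numpy sketch with an RBF kernel, where the data and the hyperparameters gamma and sigma are illustrative. The deep variant stacks such solvers: hidden-layer LS-SVMs trained on perturbed responses feed a final LS-SVM:

```python
import numpy as np

def rbf_kernel(A, B, sigma=1.0):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma**2))

def lssvm_fit(X, y, gamma=10.0, sigma=1.0):
    """Solve [[0, 1'], [1, K + I/gamma]] [b; alpha] = [0; y]."""
    n = len(y)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = rbf_kernel(X, X, sigma) + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate([[0.0], y]))
    return sol[0], sol[1:]  # bias b, dual weights alpha

rng = np.random.default_rng(6)
X = rng.uniform(-3, 3, size=(120, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=120)
b, alpha = lssvm_fit(X, y)
yhat = rbf_kernel(X, X) @ alpha + b  # fitted values on the training inputs
```

Because each layer is a closed-form solve of this system, no backpropagation through the stack is needed, which is the contrast with MNN training drawn in the abstract.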

Robust varying coefficient model using L1 regularization

  • Hwang, Changha;Bae, Jongsik;Shim, Jooyong
    • Journal of the Korean Data and Information Science Society
    • /
    • Vol. 27, No. 4
    • /
    • pp.1059-1066
    • /
    • 2016
  • In this paper we propose a robust version of varying coefficient models based on regularized regression with L1 regularization. We use the iteratively reweighted least squares procedure to solve the L1 regularized objective function of the varying coefficient model in locally weighted regression form. This provides efficient computation of the coefficient function estimates and variable selection for a given value of the smoothing variable. We present the generalized cross validation function and an Akaike information type criterion for model selection. Applications of the proposed model are illustrated through artificial examples and a real example of predicting the effect of the input variables and the smoothing variable on the output.
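The IRWLS treatment of an L1 penalty can be sketched on a plain linear model (the varying coefficient and locally weighted parts are omitted): each |b_j| is majorized by a quadratic in b_j, so every step is a ridge solve with per-coefficient weights. The data and lambda are illustrative:

```python
import numpy as np

rng = np.random.default_rng(7)
n, p = 150, 6
X = rng.normal(size=(n, p))
beta_true = np.array([2.0, 0.0, -1.5, 0.0, 0.0, 1.0])
y = X @ beta_true + 0.2 * rng.normal(size=n)

# Majorize lam * |b_j| by lam * b_j^2 / (2 |b_j_old|) + const, so each
# IRWLS step solves a ridge system with weights lam / |b_j_old|.
lam = 10.0
beta = np.linalg.lstsq(X, y, rcond=None)[0]
for _ in range(100):
    D = np.diag(lam / np.maximum(np.abs(beta), 1e-8))
    beta = np.linalg.solve(X.T @ X + D, X.T @ y)
beta[np.abs(beta) < 1e-4] = 0.0  # clean up numerically dead coefficients
```

The inactive coefficients are driven to zero by the growing weights, which is how the procedure performs variable selection; in the paper the same update runs inside each locally weighted fit.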

Sparse Kernel Regression using IRWLS Procedure

  • Park, Hye-Jung
    • Journal of the Korean Data and Information Science Society
    • /
    • Vol. 18, No. 3
    • /
    • pp.735-744
    • /
    • 2007
  • The support vector machine (SVM) is capable of providing a more complete description of the linear and nonlinear relationships among random variables. In this paper we propose a sparse kernel regression (SKR) to overcome a weak point of the SVM, namely the steep growth in the number of support vectors as the number of training data increases. The iteratively reweighted least squares (IRWLS) procedure is used to solve the optimization problem of SKR with a Laplacian prior. Furthermore, the generalized cross validation (GCV) function is introduced to select the hyperparameters which affect the performance of SKR. Experimental results are then presented which illustrate the performance of the proposed procedure.

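A sketch of the Laplacian-prior (L1-penalized) kernel regression idea via IRWLS, with an RBF kernel and synthetic data; lambda, the iteration count, and the support-vector threshold are illustrative, and the GCV search from the entry is omitted:

```python
import numpy as np

def rbf_kernel(A, B, sigma=1.0):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma**2))

rng = np.random.default_rng(8)
n = 100
X = rng.uniform(-3, 3, size=(n, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=n)

# Laplacian prior on the kernel weights alpha => L1 penalty, via IRWLS:
# each step is a weighted ridge solve with weights lam / |alpha_old|.
K = rbf_kernel(X, X)
lam = 1.0
alpha = np.linalg.solve(K + np.eye(n), y)  # bounded dense start
for _ in range(300):
    D = np.diag(lam / np.maximum(np.abs(alpha), 1e-8))
    alpha = np.linalg.solve(K.T @ K + D, K.T @ y)
n_sv = int(np.sum(np.abs(alpha) > 1e-3))  # expansion terms that survive
```

Because the L1 penalty zeroes out most kernel weights, the fitted expansion uses far fewer than n terms, which is the sparsity motivation stated in the abstract.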

Support Vector Quantile Regression with Weighted Quadratic Loss Function

  • Shim, Joo-Yong;Hwang, Chang-Ha
    • Communications for Statistical Applications and Methods
    • /
    • Vol. 17, No. 2
    • /
    • pp.183-191
    • /
    • 2010
  • Support vector quantile regression (SVQR) is capable of providing a more complete description of the linear and nonlinear relationships among random variables. In this paper we propose an iteratively reweighted least squares (IRWLS) procedure to solve the SVQR problem with a weighted quadratic loss function. Furthermore, we introduce the generalized approximate cross validation function to select the hyperparameters which affect the performance of SVQR. Experimental results are then presented which illustrate the performance of the IRWLS procedure for SVQR.
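The weighted-quadratic-loss device can be sketched directly: the check loss satisfies rho_tau(r) = w(r)·r² with w(r) = tau/|r| for positive residuals and (1 - tau)/|r| otherwise, so quantile fitting becomes iteratively reweighted least squares. A linear model with synthetic skewed errors for illustration (tau = 0.5, whose true line is known; the kernel machinery of SVQR is omitted):

```python
import numpy as np

rng = np.random.default_rng(9)
n = 1000
x = rng.uniform(0, 1, n)
y = 1.0 + 2.0 * x + rng.exponential(scale=1.0, size=n)  # skewed errors

X = np.column_stack([np.ones(n), x])
tau = 0.5  # conditional median line: intercept 1 + ln 2, slope 2

# Check loss as a weighted quadratic: rho_tau(r) = w(r) * r^2.
beta = np.linalg.lstsq(X, y, rcond=None)[0]
for _ in range(100):
    r = y - X @ beta
    w = np.where(r > 0, tau, 1.0 - tau) / np.maximum(np.abs(r), 1e-6)
    WX = X * w[:, None]
    beta = np.linalg.solve(X.T @ WX, WX.T @ y)
```

Changing tau shifts the fitted line to the corresponding conditional quantile; with asymmetric errors this differs visibly from the least squares (conditional mean) fit.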