• Title/Summary/Keyword: Least Squares Support Vector Regression

Influencing factors and prediction of carbon dioxide emissions using factor analysis and optimized least squares support vector machine

  • Wei, Siwei; Wang, Ting; Li, Yanbin
    • Environmental Engineering Research, v.22 no.2, pp.175-185, 2017
  • As energy and environmental problems become increasingly severe, research on carbon dioxide emissions has attracted widespread attention. Accurate prediction of carbon dioxide emissions is essential for controlling carbon emissions. In this paper, we analyze the relationship between carbon dioxide emissions and their influencing factors comprehensively through correlation analysis and regression analysis, effectively screening the key factors from 16 preliminarily selected factors including GDP, total population, total energy consumption, power generation, steel production, coal consumption, privately owned automobile quantity, etc. The fruit fly optimization algorithm is then used to optimize the parameters of the least squares support vector machine, and the optimized model is used for prediction, overcoming the blindness of parameter selection in the least squares support vector machine and accordingly maximizing training speed and global search ability. The results show that the prediction accuracy of carbon dioxide emissions is improved effectively. In addition, we derive economic and environmental policy implications from the analysis and calculations.
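
As background for this and the following abstracts: unlike a standard SVM, training an LS-SVM reduces to solving a single linear system in the dual variables. A minimal sketch, assuming an RBF kernel; the function names and hyperparameter defaults are illustrative, not from the paper:

```python
import numpy as np

def rbf_kernel(X, Z, sigma=1.0):
    # Gaussian RBF kernel matrix K[i, j] = exp(-||x_i - z_j||^2 / (2 sigma^2))
    sq = np.sum(X**2, axis=1)[:, None] + np.sum(Z**2, axis=1)[None, :] - 2 * X @ Z.T
    return np.exp(-sq / (2 * sigma**2))

def lssvm_fit(X, y, gamma=10.0, sigma=1.0):
    # LS-SVM dual: solve the (n+1)x(n+1) linear system
    #   [ 0   1^T           ] [ b     ]   [ 0 ]
    #   [ 1   K + I / gamma ] [ alpha ] = [ y ]
    n = len(y)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = rbf_kernel(X, X, sigma) + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[1:], sol[0]  # alpha, b

def lssvm_predict(X_train, alpha, b, X_new, sigma=1.0):
    # f(x) = sum_i alpha_i K(x, x_i) + b
    return rbf_kernel(X_new, X_train, sigma) @ alpha + b
```

Hyperparameter selection for (gamma, sigma) is exactly the "blindness of parameter selection" the abstract targets; a metaheuristic such as the fruit fly algorithm would wrap a search loop around `lssvm_fit`.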

Support Vector Quantile Regression with Weighted Quadratic Loss Function

  • Shim, Joo-Yong; Hwang, Chang-Ha
    • Communications for Statistical Applications and Methods, v.17 no.2, pp.183-191, 2010
  • Support vector quantile regression (SVQR) can provide a more complete description of the linear and nonlinear relationships among random variables. In this paper we propose an iteratively reweighted least squares (IRWLS) procedure to solve the SVQR problem with a weighted quadratic loss function. Furthermore, we introduce the generalized approximate cross validation function to select the hyperparameters that affect the performance of SVQR. Experimental results are then presented that illustrate the performance of the IRWLS procedure for SVQR.

Switching Regression Analysis via Fuzzy LS-SVM

  • Hwang, Chang-Ha
    • Journal of the Korean Data and Information Science Society, v.17 no.2, pp.609-617, 2006
  • A new fuzzy c-regression algorithm for switching regression analysis is presented, which combines fuzzy c-means clustering and the least squares support vector machine. The algorithm can detect outliers in switching regression models while simultaneously yielding estimates of the associated parameters together with a fuzzy c-partition of the data. It can be employed for model-free nonlinear regression, which does not assume the underlying form of the regression function. We illustrate the new approach with numerical examples that show how it can be used to fit switching regression models to almost all types of mixed data.

Weighted Support Vector Machines for Heteroscedastic Regression

  • Park, Hye-Jung; Hwang, Chang-Ha
    • Journal of the Korean Data and Information Science Society, v.17 no.2, pp.467-474, 2006
  • In this paper we present a weighted support vector machine (SVM) and a weighted least squares support vector machine (LS-SVM) for prediction in the heteroscedastic regression model. By adding weights to the standard SVM and LS-SVM, better fitting ability can be achieved when the errors are heteroscedastic. In numerical studies, we illustrate the prediction performance of the proposed procedure by comparing it with a procedure that combines the standard SVM and LS-SVM with the wild bootstrap for prediction.

Censored varying coefficient regression model using Buckley-James method

  • Shim, Jooyong; Seok, Kyungha
    • Journal of the Korean Data and Information Science Society, v.28 no.5, pp.1167-1177, 2017
  • Censored regression using the pseudo-response variable proposed by Buckley and James is one of the most well-known models. Recently, the varying coefficient regression model has received a great deal of attention as an important modeling tool. In this paper we propose a censored varying coefficient regression model using the Buckley-James method to handle situations where the regression coefficients of the model are not constant but change as the smoothing variables change. By using the formulation of the least squares support vector machine (LS-SVM), the coefficient estimators of the proposed model can be easily obtained from simple linear equations. Furthermore, a generalized cross validation function can be easily derived. We evaluate the proposed method and demonstrate its adequacy on simulated and real data sets.

Effect of Dimension Reduction on Prediction Performance of Multivariate Nonlinear Time Series

  • Jeong, Jun-Yong; Kim, Jun-Seong; Jun, Chi-Hyuck
    • Industrial Engineering and Management Systems, v.14 no.3, pp.312-317, 2015
  • The dynamic system approach to time series has been used in many real problems. Based on Takens' embedding theorem, we can build a predictive function whose input is the time delay coordinate vector, consisting of lagged values of the observed series, and whose output is the future value of the observed series. Although the time delay coordinate vector from a multivariate time series carries more information than that from a univariate time series, it can exhibit statistical redundancy, which degrades the performance of the prediction function. We apply dimension reduction techniques to address this problem and analyze the effect of this approach on prediction. Our experiment uses the delayed Lorenz series; least squares support vector regression approximates the predictive function. The results show that linearly preserving projection improves the prediction performance.
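
The time delay coordinate construction the abstract relies on can be sketched in a few lines; the function name and argument names are illustrative:

```python
import numpy as np

def delay_embed(series, dim, tau):
    """Time-delay coordinates: row t is (s_t, s_{t+tau}, ..., s_{t+(dim-1)*tau})."""
    n = len(series) - (dim - 1) * tau
    if n <= 0:
        raise ValueError("series too short for this (dim, tau)")
    # each column is the series shifted by a multiple of the lag tau
    return np.column_stack([series[i * tau : i * tau + n] for i in range(dim)])
```

Each row is one input vector for the predictive function. For a multivariate series, one would embed each component and concatenate the columns, which is where the redundancy the abstract mentions, and hence the dimension reduction step, comes in.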

Mixed effects least squares support vector machine for survival data analysis

  • Hwang, Chang-Ha; Shim, Joo-Yong
    • Journal of the Korean Data and Information Science Society, v.23 no.4, pp.739-748, 2012
  • In this paper we propose a mixed effects least squares support vector machine (LS-SVM) for censored data observed from different groups. We use weights that take random right censoring into account in the nonlinear regression; the weights are formed from Kaplan-Meier estimates of the censoring distribution. The proposed model includes a random effects term representing inter-group variation. Furthermore, a generalized cross validation function is proposed for selecting the optimal values of the hyperparameters. Experimental results are then presented that indicate the performance of the proposed LS-SVM in comparison with a standard LS-SVM for censored data.

Weighted LS-SVM Regression for Right Censored Data

  • Kim, Dae-Hak; Jeong, Hyeong-Chul
    • Communications for Statistical Applications and Methods, v.13 no.3, pp.765-776, 2006
  • In this paper we propose an estimation method for the regression model when the training data set contains randomly censored observations. Weighted least squares support vector machine regression is applied for regression function estimation by incorporating the weights assigned to each observation into the optimization problem. Numerical examples are given to show the performance of the proposed estimation method.
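
One standard way to form such observation weights is inverse probability of censoring weighting based on the Kaplan-Meier estimate of the censoring distribution; whether this matches the paper's exact weight construction is an assumption, and the sketch below is illustrative only:

```python
import numpy as np

def censoring_weights(t, delta):
    """Weights w_i = delta_i / G(t_i-), with G the Kaplan-Meier estimate of the
    censoring survival function (delta_i = 1 means uncensored). Censored points
    get weight 0; this is one common construction, assumed here for illustration."""
    order = np.argsort(t)
    w = np.zeros(len(t))
    G, at_risk = 1.0, len(t)
    for i in order:
        if delta[i] == 1:
            w[i] = 1.0 / G            # G just before t_i
        else:
            G *= 1.0 - 1.0 / at_risk  # a censoring event updates G
        at_risk -= 1
    return w

def weighted_lssvm(K, y, w, gamma=10.0):
    # Weighted LS-SVM dual: residual i is penalized by gamma * w_i, so the
    # regularization block of the linear system becomes diag(1 / (gamma * w_i)).
    n = len(y)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.diag(1.0 / (gamma * np.maximum(w, 1e-8)))
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[1:], sol[0]  # alpha, b
```

A weight of zero (here floored at 1e-8) makes the corresponding diagonal entry huge, so that observation's alpha is driven to zero and the censored point is effectively ignored by the fit.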

Deep LS-SVM for regression

  • Hwang, Changha; Shim, Jooyong
    • Journal of the Korean Data and Information Science Society, v.27 no.3, pp.827-833, 2016
  • In this paper, we propose a deep least squares support vector machine (LS-SVM) for regression problems, which consists of an input layer and a hidden layer. In the hidden layer, LS-SVMs are trained with the original input variables and perturbed responses. For the final output, the main LS-SVM is trained with the outputs of the hidden-layer LS-SVMs as input variables and the original responses. In contrast to the multilayer neural network (MNN), the LS-SVMs in the deep LS-SVM are each trained to minimize a penalized objective function. Thus, the learning dynamics of the deep LS-SVM are entirely different from those of the MNN, in which all weights and biases are trained to minimize one final error function. Compared to MNN approaches, the deep LS-SVM does not use any combination weights, but trains all LS-SVMs in the architecture. Experimental results on real datasets illustrate that the deep LS-SVM significantly outperforms state-of-the-art machine learning methods on regression problems.
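
The two-layer architecture described above can be sketched end to end. The layer size, kernel, noise level, and hyperparameter values below are illustrative assumptions, not the paper's settings:

```python
import numpy as np

def rbf(X, Z, sigma=0.5):
    # Gaussian RBF kernel matrix between rows of X and rows of Z
    sq = np.sum(X**2, 1)[:, None] + np.sum(Z**2, 1)[None, :] - 2 * X @ Z.T
    return np.exp(-sq / (2 * sigma**2))

def lssvm(X, y, gamma=100.0, sigma=0.5):
    # one LS-SVM: solve the dual linear system, return a predictor closure
    n = len(y)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = A[1:, 0] = 1.0
    A[1:, 1:] = rbf(X, X, sigma) + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    b, alpha = sol[0], sol[1:]
    return lambda Z: rbf(Z, X, sigma) @ alpha + b

def deep_lssvm(X, y, n_hidden=3, noise=0.1, seed=0):
    rng = np.random.default_rng(seed)
    # hidden layer: LS-SVMs trained on the original inputs, each with its own
    # perturbed copy of the responses
    hidden = [lssvm(X, y + noise * rng.standard_normal(len(y)))
              for _ in range(n_hidden)]
    H = np.column_stack([h(X) for h in hidden])
    # output layer: the main LS-SVM maps hidden outputs to the original responses
    top = lssvm(H, y)
    return lambda Z: top(np.column_stack([h(Z) for h in hidden]))
```

Note that each LS-SVM is fitted by its own linear solve, so there is no end-to-end backpropagation; that is the contrast with the MNN the abstract draws.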

Estimation of nonlinear GARCH-M model

  • Shim, Joo-Yong; Lee, Jang-Taek
    • Journal of the Korean Data and Information Science Society, v.21 no.5, pp.831-839, 2010
  • The least squares support vector machine (LS-SVM) is a kernel-based method that has gained considerable popularity in regression and classification problems. We use the LS-SVM to propose an iterative algorithm for a nonlinear generalized autoregressive conditional heteroscedasticity in mean (GARCH-M) model to estimate the mean and the conditional volatility of stock market returns. The proposed method combines a weighted LS-SVM for the mean and an unweighted LS-SVM for the conditional volatility. In this paper, we show via estimations on real data that nonlinear GARCH-M models outperform the linear GARCH model and the linear GARCH-M model.