• Title/Summary/Keyword: Multivariate Regression

A multivariate adaptive regression splines model for estimation of maximum wall deflections induced by braced excavation

  • Xiang, Yuzhou; Goh, Anthony Teck Chee; Zhang, Wengang; Zhang, Runhong
    • Geomechanics and Engineering / v.14 no.4 / pp.315-324 / 2018
  • With rapid economic growth, numerous deep excavation projects for high-rise buildings and subway transportation networks have been constructed in the past two decades. Deep excavations particularly in thick deposits of soft clay may cause excessive ground movements and thus result in potential damage to adjacent buildings and supporting utilities. Extensive plane strain finite element analyses considering small strain effect have been carried out to examine the wall deflections for excavations in soft clay deposits supported by diaphragm walls and bracings. The excavation geometrical parameters, soil strength and stiffness properties, soil unit weight, the strut stiffness and wall stiffness were varied to study the wall deflection behaviour. Based on these results, a multivariate adaptive regression splines model was developed for estimating the maximum wall deflection. Parametric analyses were also performed to investigate the influence of the various design variables on wall deflections.
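The hinge-function basis that MARS builds on can be sketched briefly. Below is a minimal illustration on simulated data with fixed, assumed knots (real MARS selects knots and interaction terms adaptively), not the paper's fitted deflection model:

```python
import numpy as np

# MARS-style fit sketch: hinge basis functions max(0, x - t) at assumed
# knots, combined by ordinary least squares.
rng = np.random.default_rng(0)
x = rng.uniform(0.0, 10.0, size=200)
y = np.where(x < 4.0, 2.0 * x, 8.0 + 0.5 * (x - 4.0))  # piecewise-linear truth

knots = [2.0, 4.0, 6.0, 8.0]  # assumed knot locations for illustration
B = np.column_stack(
    [np.ones_like(x), x] + [np.maximum(0.0, x - t) for t in knots]
)
coef, *_ = np.linalg.lstsq(B, y, rcond=None)  # least-squares fit of the basis
pred = B @ coef
rmse = float(np.sqrt(np.mean((pred - y) ** 2)))
```

Because the simulated response is itself piecewise linear with a kink at one of the knots, the hinge basis reproduces it essentially exactly.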

Application of Multivariate Adaptive Regression Spline-Assisted Objective Function on Optimization of Heat Transfer Rate Around a Cylinder

  • Dey, Prasenjit; Das, Ajoy K.
    • Nuclear Engineering and Technology / v.48 no.6 / pp.1315-1320 / 2016
  • The present study aims to predict the heat transfer characteristics around a square cylinder with different corner radii using multivariate adaptive regression splines (MARS). Further, the MARS-generated objective function is optimized by particle swarm optimization. The data for the prediction are taken from the recently published article by the present authors [P. Dey, A. Sarkar, A.K. Das, Development of GEP and ANN model to predict the unsteady forced convection over a cylinder, Neural Comput. Appl. (2015) 1-13]. Further, the MARS model is compared with artificial neural network and gene expression programming. It has been found that the MARS model is very efficient in predicting the heat transfer characteristics. It has also been found that MARS is more efficient than artificial neural network and gene expression programming in predicting the forced convection data, and also particle swarm optimization can efficiently optimize the heat transfer rate.
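Particle swarm optimization of the kind used above can be sketched in a few lines. This is a bare-bones swarm minimizing a stand-in quadratic objective (the paper optimizes a MARS-built surrogate, which is not reproduced); the hyperparameters are typical textbook values, not the authors' settings:

```python
import numpy as np

# Minimal particle swarm optimizer: each particle tracks its personal best,
# and all particles are attracted to the global best found so far.
def pso(f, dim, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    rng = np.random.default_rng(seed)
    pos = rng.uniform(-5.0, 5.0, (n_particles, dim))
    vel = np.zeros_like(pos)
    pbest = pos.copy()
    pbest_val = np.array([f(p) for p in pos])
    gbest = pbest[np.argmin(pbest_val)].copy()
    for _ in range(iters):
        r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = pos + vel
        vals = np.array([f(p) for p in pos])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
        gbest = pbest[np.argmin(pbest_val)].copy()
    return gbest, float(f(gbest))

best_x, best_val = pso(lambda p: float(np.sum((p - 1.0) ** 2)), dim=2)
```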

Applications of response dimension reduction in large p-small n problems

  • Minjee Kim; Jae Keun Yoo
    • Communications for Statistical Applications and Methods / v.31 no.2 / pp.191-202 / 2024
  • The goal of this paper is to show how multivariate regression analysis with high-dimensional responses is facilitated by response dimension reduction. Multivariate regression, characterized by multi-dimensional response variables, is increasingly prevalent across diverse fields such as repeated measures, longitudinal studies, and functional data analysis. One of the key challenges in analyzing such data is managing the response dimensions, which can complicate the analysis due to an exponential increase in the number of parameters. Although response dimension reduction methods have been developed, there are few practically useful illustrations for data such as so-called large p-small n data. This paper aims to fill this gap by showcasing how response dimension reduction can enhance the analysis of high-dimensional response data, thereby providing significant assistance to statistical practitioners and contributing to advancements in multiple scientific domains.
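The general idea can be illustrated with a generic projection of a high-dimensional response onto a few leading principal components before regression. This is plain PCA-based reduction on simulated data, not the specific response dimension reduction method of the paper:

```python
import numpy as np

# Responses of dimension q = 100 with only n = 30 observations, driven by
# 3 latent factors: the reduced n x 3 responses replace the n x 100 ones.
rng = np.random.default_rng(1)
n, p, q = 30, 5, 100
X = rng.normal(size=(n, p))
W = rng.normal(size=(p, 3)) @ rng.normal(size=(3, q))  # rank-3 signal
Y = X @ W + 0.1 * rng.normal(size=(n, q))

Yc = Y - Y.mean(axis=0)
U, s, Vt = np.linalg.svd(Yc, full_matrices=False)
k = 3
Z = Yc @ Vt[:k].T                         # reduced responses, shape (n, k)
explained = float(np.sum(s[:k] ** 2) / np.sum(s ** 2))
```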

Fused inverse regression with multi-dimensional responses

  • Cho, Youyoung; Han, Hyoseon; Yoo, Jae Keun
    • Communications for Statistical Applications and Methods / v.28 no.3 / pp.267-279 / 2021
  • Regression with multi-dimensional responses is quite common in the so-called big data era. In such regression, to relieve the curse of dimensionality caused by the high dimension of the responses, dimension reduction of the predictors is essential in the analysis. Sufficient dimension reduction provides effective tools for this reduction, but few sufficient dimension reduction methodologies exist for multivariate regression. To fill this gap, we propose two new fused slice-based inverse regression methods. The proposed approaches are robust to the number of clusters or slices and improve the estimation results over existing methods by fusing many kernel matrices. Numerical studies are presented and compared with existing methods. Real data analysis confirms the practical usefulness of the proposed methods.
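Basic single-slicing sliced inverse regression, whose kernel matrices the fused methods above combine, can be sketched as follows. This illustrates only the standard SIR kernel for a univariate response, not the paper's fused estimators:

```python
import numpy as np

# Standard SIR: standardize X, slice on y, average the standardized X
# within slices, and eigen-decompose the resulting kernel matrix.
def sir_directions(X, y, n_slices=5, n_dirs=1):
    n, p = X.shape
    Xc = X - X.mean(axis=0)
    cov = Xc.T @ Xc / n
    evals, evecs = np.linalg.eigh(cov)
    inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T
    Z = Xc @ inv_sqrt                      # standardized predictors
    order = np.argsort(y)
    M = np.zeros((p, p))                   # the SIR kernel matrix
    for idx in np.array_split(order, n_slices):
        m = Z[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m, m)
    w, v = np.linalg.eigh(M)
    dirs = inv_sqrt @ v[:, ::-1][:, :n_dirs]  # leading directions, original scale
    return dirs / np.linalg.norm(dirs, axis=0)

rng = np.random.default_rng(2)
X = rng.normal(size=(500, 4))
beta = np.array([1.0, -1.0, 0.0, 0.0]) / np.sqrt(2.0)
y = (X @ beta) ** 3 + 0.1 * rng.normal(size=500)
b_hat = sir_directions(X, y)[:, 0]
align = float(abs(b_hat @ beta))           # close to 1 if direction recovered
```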

A study on the multivariate sliced inverse regression (다변량 분할 역회귀모형에 관한 연구)

  • 이용구; 이덕기
    • The Korean Journal of Applied Statistics / v.10 no.2 / pp.293-308 / 1997
  • Sliced inverse regression is a method for reducing the dimension of the explanatory variable X without going through any parametric or nonparametric model-fitting process. This method exploits the simplicity of the inverse view of regression; that is, instead of regressing the univariate output variable y against the multivariate X, we regress X against y. In this article, we propose bivariate sliced inverse regression, which regresses the multivariate X against two output variables $y_1, y_2$. Bivariate sliced inverse regression estimates the e.d.r. directions satisfying two generalized regression models simultaneously. For its application, we decompose the output variable y into two variables: one obtained by projecting y onto the column space of X, and the other, r, obtained by projecting y onto the space orthogonal to the column space of X; we then estimate the e.d.r. directions of the generalized regression model by utilizing the two variables simultaneously. As a result, bivariate sliced inverse regression, by considering the two variables simultaneously, estimates the e.d.r. directions efficiently and stably when the regression model is linear, quadratic, or nonlinear.
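The decomposition of y described in this abstract is a standard projection computation and can be written directly (simulated data for illustration):

```python
import numpy as np

# Split y into its projection onto the column space of X (the fitted
# values) and the component orthogonal to it (the residuals).
rng = np.random.default_rng(3)
X = rng.normal(size=(100, 3))
y = X @ np.array([1.0, 2.0, -1.0]) + rng.normal(size=100)

H = X @ np.linalg.inv(X.T @ X) @ X.T    # hat (projection) matrix onto col(X)
y_proj = H @ y                           # component in the column space of X
r = y - y_proj                           # component orthogonal to col(X)

orthogonal = float(abs(X.T @ r).max())   # r is orthogonal to every column of X
```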


Multivariate Analysis for Clinicians (임상의를 위한 다변량 분석의 실제)

  • Oh, Joo Han; Chung, Seok Won
    • Clinics in Shoulder and Elbow / v.16 no.1 / pp.63-72 / 2013
  • In medical research, multivariate analysis, especially multiple regression analysis, is used to analyze the influence of multiple variables on an outcome. Beyond the basic assumptions of regression analysis, multiple regression analysis must address which variables to include in the model and the problem of multicollinearity among them. The fit of a multiple regression model is expressed by the coefficient of determination, $R^2$, and the influence of each independent variable on the outcome by a regression coefficient, ${\beta}$. Multiple regression analysis can be divided into multiple linear regression analysis, multiple logistic regression analysis, and Cox regression analysis according to the type of dependent variable (continuous, categorical (binary logit), and state variable, respectively), and the influence of variables on the outcome is evaluated by the regression coefficient ${\beta}$, the odds ratio, and the hazard ratio, respectively. Knowledge of multivariate analysis enables clinicians to analyze results accurately and to design further research efficiently.
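The quantities named above ($R^2$, ${\beta}$, and the odds ratio $e^{\beta}$) can be illustrated numerically on simulated data; this is a generic sketch, not clinical data:

```python
import numpy as np

# Linear outcome: regression coefficients beta and coefficient of
# determination R^2 from ordinary least squares.
rng = np.random.default_rng(4)
n = 500
X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])
y = X @ np.array([1.0, 2.0, -0.5]) + rng.normal(size=n)
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta
r2 = 1.0 - resid @ resid / np.sum((y - y.mean()) ** 2)

# Binary outcome: logistic fit by plain gradient ascent; exp(beta) is the
# odds ratio per one-unit increase in the predictor.
p_true = 1.0 / (1.0 + np.exp(-(X @ np.array([0.0, 1.0, 0.0]))))
z = (rng.random(n) < p_true).astype(float)
b = np.zeros(3)
for _ in range(5000):
    p_hat = 1.0 / (1.0 + np.exp(-(X @ b)))
    b += 0.01 * X.T @ (z - p_hat) / n
odds_ratio = float(np.exp(b[1]))   # true value is e^1, about 2.7
```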

Principal selected response reduction in multivariate regression (다변량회귀에서 주선택 반응변수 차원축소)

  • Yoo, Jae Keun
    • The Korean Journal of Applied Statistics / v.34 no.4 / pp.659-669 / 2021
  • Multivariate regression often appears in longitudinal or functional data analysis. Since multivariate regression involves multi-dimensional response variables, it is more strongly affected by the so-called curse of dimensionality than univariate regression. To overcome this issue, Yoo (2018) and Yoo (2019a) proposed three model-based response dimension reduction methodologies. According to various numerical studies in Yoo (2019a), the default method suggested there is the least sensitive to the simulated models, but it is not the best one. To remedy this, the paper proposes a selection algorithm that compares the other two methods with the default one. This approach is called principal selected response reduction. Various simulation studies show that the proposed method provides more accurate estimation results than the default method of Yoo (2019a), confirming its practical and empirical usefulness.
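The "compare candidates, then select" idea can be sketched generically: given several candidate response-reduction bases, keep the one that preserves the responses best. The paper's actual comparison rule is not reproduced here; this only illustrates selection by reconstruction error:

```python
import numpy as np

# Simulated 2-factor responses; candidates are the leading principal
# subspace versus a random orthonormal subspace.
rng = np.random.default_rng(5)
n, q = 100, 8
F = rng.normal(size=(n, 2))
Y = F @ rng.normal(size=(2, q)) + 0.1 * rng.normal(size=(n, q))
Yc = Y - Y.mean(axis=0)

U, s, Vt = np.linalg.svd(Yc, full_matrices=False)
candidates = {"pca": Vt[:2].T,
              "random": np.linalg.qr(rng.normal(size=(q, 2)))[0]}

def recon_error(B):
    P = B @ B.T                     # projector onto span(B) (B orthonormal)
    return float(np.linalg.norm(Yc - Yc @ P) ** 2)

best = min(candidates, key=lambda k: recon_error(candidates[k]))
```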

Development of Discriminant Analysis System by Graphical User Interface of Visual Basic

  • Lee, Yong-Kyun; Shin, Young-Jae; Cha, Kyung-Joon
    • Journal of the Korean Data and Information Science Society / v.18 no.2 / pp.447-456 / 2007
  • Recently, multivariate statistical analysis has been used to extract meaningful information from various data. In this paper, we develop a multivariate statistical analysis system that combines Fisher discriminant analysis, logistic regression, neural networks, and decision trees, implemented with Visual Basic 6.0.
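Of the methods bundled in the system, two-class Fisher discriminant analysis is compact enough to sketch (the GUI itself is not reproduced; data are simulated):

```python
import numpy as np

# Fisher LDA: the direction w maximizes between-class separation relative
# to within-class scatter; for two classes, w solves Sw w = (m1 - m0).
def fisher_direction(X0, X1):
    m0, m1 = X0.mean(axis=0), X1.mean(axis=0)
    Sw = (np.cov(X0, rowvar=False) * (len(X0) - 1)
          + np.cov(X1, rowvar=False) * (len(X1) - 1))  # pooled within scatter
    w = np.linalg.solve(Sw, m1 - m0)
    return w / np.linalg.norm(w)

rng = np.random.default_rng(6)
X0 = rng.normal(size=(200, 2))
X1 = rng.normal(size=(200, 2)) + np.array([3.0, 0.0])
w = fisher_direction(X0, X1)
# Classify by projecting onto w and thresholding at the midpoint of the means.
thresh = (X0.mean(axis=0) + X1.mean(axis=0)) @ w / 2.0
acc = float(np.mean(np.concatenate([X0 @ w < thresh, X1 @ w >= thresh])))
```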


Admissible Hierarchical Bayes Estimators of a Multivariate Normal Mean Shrinking towards a Regression Surface

  • Cho, Byung-Yup; Choi, Kuey-Chung; Chang, In-Hong
    • Communications for Statistical Applications and Methods / v.3 no.2 / pp.205-216 / 1996
  • Consider the problem of estimating a multivariate normal mean with an unknown covariance matrix under a weighted sum of squared error losses. We first provide hierarchical Bayes estimators which shrink the usual (maximum likelihood, uniformly minimum variance unbiased) estimator towards a regression surface, and then prove the admissibility of these estimators using Blyth's (1951) method.
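The idea of shrinking the usual estimator towards a regression surface can be illustrated with a James-Stein-type sketch (known identity covariance here; the paper's hierarchical Bayes estimator, which handles unknown covariance, is not reproduced):

```python
import numpy as np

# True means lie on a line, so shrinking y toward the fitted line helps.
rng = np.random.default_rng(7)
p = 50
X = np.column_stack([np.ones(p), np.arange(p, dtype=float)])
theta = X @ np.array([1.0, 0.2])          # true mean vector on the surface
y = theta + rng.normal(size=p)            # usual estimator: y ~ N(theta, I)

b, *_ = np.linalg.lstsq(X, y, rcond=None)
surface = X @ b                            # regression surface fitted to y
resid_sq = float(np.sum((y - surface) ** 2))
k = X.shape[1]
shrink = max(0.0, 1.0 - (p - k - 2) / resid_sq)  # positive-part factor
estimate = surface + shrink * (y - surface)       # shrunk toward the surface

mse_usual = float(np.sum((y - theta) ** 2))
mse_shrunk = float(np.sum((estimate - theta) ** 2))
```

Because the true means sit on the surface, the shrunk estimator beats the usual one in squared error on this draw.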
