• Title/Summary/Keyword: statistical analysis method

Search Results: 5,035

Arrow Diagrams for Kernel Principal Component Analysis

  • Huh, Myung-Hoe
    • Communications for Statistical Applications and Methods
    • /
    • v.20 no.3
    • /
    • pp.175-184
    • /
    • 2013
  • Kernel principal component analysis (PCA) maps observations in a nonlinear feature space to a reduced-dimensional plane of principal components. The feature space need not be specified explicitly because the procedure uses the kernel trick. In this paper, we propose a graphical scheme to represent variables in kernel principal component analysis. In addition, we propose an index that measures the importance of individual variables in the principal component plane.
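The kernel trick mentioned in the abstract can be made concrete with a short sketch: a minimal RBF-kernel PCA in plain NumPy. The data, the `gamma` value, and the function name are illustrative assumptions, not the paper's graphical scheme:

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy data: two concentric rings, a classic nonlinear structure
n = 100
theta = rng.uniform(0, 2 * np.pi, n)
r = np.r_[np.ones(n // 2), 3 * np.ones(n // 2)]
X = np.c_[r * np.cos(theta), r * np.sin(theta)] + 0.05 * rng.standard_normal((n, 2))

def kernel_pca(X, n_components=2, gamma=1.0):
    """Kernel PCA with an RBF kernel, using only the kernel matrix."""
    sq = np.sum(X**2, axis=1)
    K = np.exp(-gamma * (sq[:, None] + sq[None, :] - 2 * X @ X.T))
    m = K.shape[0]
    one = np.ones((m, m)) / m
    Kc = K - one @ K - K @ one + one @ K @ one   # center in feature space
    vals, vecs = np.linalg.eigh(Kc)              # eigenvalues in ascending order
    idx = np.argsort(vals)[::-1][:n_components]  # keep the largest ones
    vals, vecs = vals[idx], vecs[:, idx]
    # Scores: projections of the observations on the principal components
    return vecs * np.sqrt(np.maximum(vals, 0))

scores = kernel_pca(X, n_components=2, gamma=0.5)
```

Centering the kernel matrix (`Kc`) plays the role of mean-centering in the implicit feature space; the `scores` are what would be plotted on the principal component plane.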

A Stochastic Nonlinear Analysis of Daily Runoff Discharge Using Artificial Intelligence Technique (인공지능기법을 이용한 일유출량의 추계학적 비선형해석)

  • 안승섭;김성원
    • Magazine of the Korean Society of Agricultural Engineers
    • /
    • v.39 no.6
    • /
    • pp.54-66
    • /
    • 1997
  • The objective of this study is to introduce and apply neural network theory to real hydrologic systems for the stochastic nonlinear prediction of daily runoff discharge in a river catchment. The back-propagation algorithm of the neural network model is applied to estimate daily stochastic runoff discharge from historical daily rainfall and observed runoff discharge. To assess the fitness and efficiency of the models, statistical analysis is carried out between observed and predicted discharge over the chosen runoff periods. The statistical analysis shows that method 3, which has more processing elements in the input layer, is a more prominent model than the others (methods 1 and 2). On the basis of this study, further research is needed to develop neural network algorithms for flood prediction, including real-time forecasting, and for the optimal operation of dam systems.
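As a rough illustration of the back-propagation procedure the abstract describes, the sketch below trains a one-hidden-layer network on synthetic rainfall-runoff data. The data, layer size, and learning rate are invented for illustration; the paper's methods 1-3 and real records are not reproduced:

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic daily rainfall (mm) and a lagged, mildly nonlinear runoff response
rain = rng.gamma(2.0, 5.0, 400)
runoff = 0.6 * rain + 0.3 * np.roll(rain, 1) + 0.1 * np.sqrt(rain) * np.roll(rain, 2)
X = np.c_[rain, np.roll(rain, 1), np.roll(rain, 2)][2:]  # inputs: rain at t, t-1, t-2
y = runoff[2:, None]
X = X / X.max(axis=0)   # scale to [0, 1], usual for sigmoid networks
y = y / y.max()

# One hidden layer with sigmoid activations, trained by plain gradient
# descent, i.e. the back-propagation algorithm
W1 = rng.standard_normal((3, 8)) * 0.5; b1 = np.zeros(8)
W2 = rng.standard_normal((8, 1)) * 0.5; b2 = np.zeros(1)
sig = lambda z: 1 / (1 + np.exp(-z))

losses = []
for _ in range(2000):
    H = sig(X @ W1 + b1)           # forward pass, hidden layer
    yhat = H @ W2 + b2             # linear output unit
    err = yhat - y
    losses.append(float(np.mean(err**2)))
    gy = 2 * err / len(y)          # backward pass: mean-squared-error gradients
    gW2 = H.T @ gy; gb2 = gy.sum(0)
    gH = gy @ W2.T * H * (1 - H)   # chain rule through the sigmoid
    gW1 = X.T @ gH; gb1 = gH.sum(0)
    lr = 0.2
    W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2
```

Model comparison as in the abstract would then amount to repeating the fit with different numbers of input-layer elements and comparing the fit statistics on held-out periods.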


Numerical and statistical analysis of permeability of concrete as a random heterogeneous composite

  • Zhou, Chunsheng;Li, Kefei
    • Computers and Concrete
    • /
    • v.7 no.5
    • /
    • pp.469-482
    • /
    • 2010
  • This paper investigates the concrete permeability through a numerical and statistical approach. Concrete is considered as a random heterogeneous composite of three phases: aggregates, interfacial transition zones (ITZ) and matrix. The paper begins with some classical bound and estimate theories applied to concrete permeability and the influence of ITZ on these bound and estimate values is discussed. Numerical samples for permeability analysis are established through random aggregate structure (RAS) scheme, each numerical sample containing randomly distributed aggregates coated with ITZ and dispersed in a homogeneous matrix. The volumetric fraction of aggregates is fixed and the size distribution of aggregates observes Fuller's curve. Then finite element method is used to solve the steady permeation problem on 2D numerical samples and the overall permeability is deduced from flux-pressure relation. The impact of ITZ on overall permeability is analyzed in terms of ITZ width and contrast ratio between ITZ and matrix permeabilities. Hereafter, 3680 samples are generated for 23 sample sizes and 4 contrast ratios, and statistical analysis is performed on the permeability dispersion in terms of sample size and ITZ characteristics. By sample theory, the size of representative volume element (RVE) for permeability is then quantified considering sample realization number and expected error. Concluding remarks are provided for the impact of ITZ on concrete permeability and its statistical characteristics.
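The final RVE-sizing step, choosing the number of realizations for a target expected error, follows classical sampling theory. A hedged sketch (the mean and standard deviation below are made-up stand-ins, not the paper's results):

```python
import math

# Hypothetical statistics for one sample size: mean permeability and
# standard deviation across random-aggregate-structure realizations
mean_k = 2.0e-17   # m^2 (illustrative)
std_k = 0.3e-17    # m^2 (illustrative)
cv = std_k / mean_k  # coefficient of variation = 0.15

def realizations_needed(cv, rel_error, z=1.96):
    """Realizations needed so the 95% confidence half-width on the mean
    permeability stays within rel_error of the mean (sampling theory)."""
    return math.ceil((z * cv / rel_error) ** 2)

n_5pct = realizations_needed(cv, 0.05)   # realizations for a 5% expected error
```

Repeating this for each sample size gives the trade-off between sample size and realization number from which an RVE size can be read off.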

Optimization and investigations of low-velocity bending impact of thin-walled beams

  • Hossein Taghipoor;Mahdi Sefidi
    • Steel and Composite Structures
    • /
    • v.50 no.2
    • /
    • pp.159-181
    • /
    • 2024
  • In the present study, the effect of geometrical parameters of two different types of aluminum thin-walled structures on energy absorption under three-point bending impact loading has been investigated experimentally and numerically. To evaluate the effect of the parameters on the specific energy absorption (SEA), the initial peak crushing force (IPCF), and the maximum crushing distance (δ), a design of experiments (DOE) technique with the response surface method (RSM) was applied. Four different thin-walled structures were tested under low-velocity impact and then simulated in ABAQUS. Acceptable consistency between the numerical and experimental results was obtained. Statistical analysis was performed on various parameters of three different types of tubes. In the first and second statistical analyses, the dimensional parameters of the cross-section, the number of holes, and the dimensional parameter of the holes were the design variables; the diameter reduction rate and the number of sections with different diameters belong to the third statistical analysis. All design points of the statistical method were simulated with the finite element package ABAQUS/Explicit. The final results show that the height and thickness of the tubes were more influential than the other geometrical parameters, and although the deformations of the cylindrical tubes were around forty percent greater than those of the rectangular tubes, the highest desirability belonged to the cylindrical tubes with reduced cross-sections.
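The DOE/RSM step amounts to fitting a second-order polynomial response surface to the simulated design points by least squares. Everything below (factor names, coded levels, the synthetic response) is assumed for illustration, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(2)
# Hypothetical DOE: coded tube height h and wall thickness t in [-1, 1],
# with an invented quadratic response standing in for measured SEA
h = rng.uniform(-1, 1, 30)
t = rng.uniform(-1, 1, 30)
sea = 10 + 3*h + 2*t + 1.5*h*t - 1.0*h**2 + 0.5*t**2 + 0.1*rng.standard_normal(30)

# Second-order response surface: sea ~ b0 + b1 h + b2 t + b3 h t + b4 h^2 + b5 t^2
A = np.c_[np.ones_like(h), h, t, h*t, h**2, t**2]
coef, *_ = np.linalg.lstsq(A, sea, rcond=None)
r2 = 1 - np.sum((A @ coef - sea)**2) / np.sum((sea - sea.mean())**2)
# The fitted surface can then be searched for the most desirable settings,
# e.g. the (h, t) maximizing predicted SEA over the design region
```

In a multi-response study like this one, a separate surface is fitted for each response (SEA, IPCF, δ) and the desirability function combines them.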

Improving the Gumbel analysis by using M-th highest extremes

  • Cook, Nicholas J.
    • Wind and Structures
    • /
    • v.1 no.1
    • /
    • pp.25-42
    • /
    • 1998
  • Improvements to the Gumbel method of extreme value analysis of wind data made over the last two decades are reviewed and illustrated using sample data for Jersey. A new procedure for extending the Gumbel method to include the M-th highest annual extremes is shown to be less effective than the standard method, but it leads to a method for calibrating peak-over-threshold methods against the standard Gumbel approach. Peak-over-threshold methods that include at least the 3rd highest annual extremes, specifically the modified Jensen and Franck method and the "method of independent storms", are shown to give the best estimates of extremes from observations.
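For readers unfamiliar with the standard Gumbel approach being improved upon, here is a minimal method-of-moments fit to annual maxima and the implied return-period estimate. The data are synthetic stand-ins for the Jersey record:

```python
import math
import random

random.seed(3)
# Hypothetical annual maximum gust speeds (m/s), drawn from a Gumbel law
annual_max = [28 + 4 * (-math.log(-math.log(random.random()))) for _ in range(30)]

n = len(annual_max)
mean = sum(annual_max) / n
var = sum((x - mean) ** 2 for x in annual_max) / (n - 1)
beta = math.sqrt(6 * var) / math.pi      # Gumbel scale, method of moments
mu = mean - 0.5772156649 * beta          # Gumbel location (Euler-Mascheroni)

def return_value(T):
    """Wind speed with return period T years under the fitted Gumbel law."""
    return mu - beta * math.log(-math.log(1 - 1 / T))

v50 = return_value(50)   # the 50-year design value
```

The M-th highest and peak-over-threshold variants reviewed in the paper differ in which observations enter the fit, not in this final return-value calculation.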

Two-stage imputation method to handle missing data for categorical response variable

  • Jong-Min Kim;Kee-Jae Lee;Seung-Joo Lee
    • Communications for Statistical Applications and Methods
    • /
    • v.30 no.6
    • /
    • pp.577-587
    • /
    • 2023
  • Conventional categorical data imputation techniques, such as mode imputation, often suffer from overestimation. If the variable has too many categories, the multinomial logistic regression imputation method may become computationally infeasible. To overcome these limitations, we propose a two-stage imputation method. In the first stage, we apply the Boruta variable selection method to the complete dataset to identify variables important for the target categorical variable. In the second stage, we use those important variables in logistic regression to impute missing data in binary variables, in polytomous regression to impute missing data in categorical variables, and in predictive mean matching to impute missing data in quantitative variables. Through analysis of both asymmetric and non-normal simulated and real data, we demonstrate that the two-stage imputation method outperforms imputation methods lacking variable selection, as evidenced by accuracy measures. In the analysis of real survey data, our two-stage imputation method also surpasses the current imputation approach in terms of accuracy.
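The two-stage idea, select predictors first and then impute with a regression model, can be sketched as follows. The correlation screen below is a simple stand-in for the Boruta procedure, and all data and thresholds are invented:

```python
import numpy as np

rng = np.random.default_rng(4)
# Synthetic data: binary target y depends on x0 and x1 only; x2 is noise
n = 500
X = rng.standard_normal((n, 3))
p = 1 / (1 + np.exp(-(1.5 * X[:, 0] - 2.0 * X[:, 1])))
y = (rng.random(n) < p).astype(float)
miss = rng.random(n) < 0.2     # 20% of y is missing
obs = ~miss

# Stage 1: variable selection on the complete cases (a correlation screen
# here, standing in for the Boruta procedure used in the paper)
corr = np.abs([np.corrcoef(X[obs, j], y[obs])[0, 1] for j in range(3)])
keep = np.where(corr > 0.2)[0]

# Stage 2: logistic regression on the selected variables, fit by gradient
# descent, then used to impute the missing binary values
Z = np.c_[np.ones(obs.sum()), X[obs][:, keep]]
w = np.zeros(Z.shape[1])
for _ in range(500):
    g = Z.T @ (1 / (1 + np.exp(-Z @ w)) - y[obs]) / len(Z)
    w -= 1.0 * g

Zm = np.c_[np.ones(miss.sum()), X[miss][:, keep]]
y_imp = (1 / (1 + np.exp(-Zm @ w)) > 0.5).astype(float)
accuracy = float(np.mean(y_imp == y[miss]))   # known here only because y is simulated
```

In the paper's full method the second stage dispatches on variable type (logistic, polytomous, or predictive mean matching); only the binary branch is sketched here.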

Cubic normal distribution and its significance in structural reliability

  • Zhao, Yan-Gang;Lu, Zhao-Hui
    • Structural Engineering and Mechanics
    • /
    • v.28 no.3
    • /
    • pp.263-280
    • /
    • 2008
  • Information on the distribution of the basic random variable is essential for accurate analysis of structural reliability. The usual way to determine the distribution is to fit a candidate distribution to the histogram of the available statistical data of the variable and perform approximate goodness-of-fit tests. Generally, such a candidate distribution has parameters that may be evaluated from the statistical moments of the data. In the present paper, a cubic normal distribution, whose parameters are determined from the first four moments of the available sample data, is investigated. A parameter table based on the first four moments, which simplifies parameter estimation, is given. The simplicity, generality, flexibility, and advantages of this distribution in statistical data analysis, and its significance in structural reliability evaluation, are discussed. Numerical examples are presented to demonstrate these advantages.
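The moment-based fitting the abstract describes starts from the first four sample moments. The sketch below computes them and checks a candidate cubic-normal parameter set by Monte Carlo rather than via the paper's parameter table; the data and the coefficients `a`, `b`, `c`, `d` are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(5)
# Hypothetical skewed data standing in for observations of a basic random variable
data = np.exp(0.3 * rng.standard_normal(5000))

def first_four_moments(x):
    """Mean, standard deviation, skewness, and (non-excess) kurtosis."""
    m, s = x.mean(), x.std(ddof=1)
    z = (x - m) / s
    return m, s, float(np.mean(z**3)), float(np.mean(z**4))

m, s, skew, kurt = first_four_moments(data)

# Cubic normal model: X = a + b U + c U^2 + d U^3 with U ~ N(0, 1).
# A candidate parameter set is checked here by simulation; the paper instead
# reads the parameters off a table indexed by the first four moments.
u = rng.standard_normal(200_000)
a, b, c, d = m - 0.05 * s, 0.9 * s, 0.05 * s, 0.01 * s   # illustrative values
x_model = a + b * u + c * u**2 + d * u**3
m2, s2, skew2, kurt2 = first_four_moments(x_model)
```

A positive `c` produces positive skewness and a positive `d` thickens the tails, which is why four moments suffice to pin down the four parameters.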

Residuals Plots for Repeated Measures Data

  • PARK TAESUNG
    • Proceedings of the Korean Statistical Society Conference
    • /
    • 2000.11a
    • /
    • pp.187-191
    • /
    • 2000
  • In the analysis of repeated measurements, multivariate regression models that account for the correlations among observations from the same subject are widely used. Like the usual univariate regression models, these multivariate regression models need model diagnostic procedures. In this paper, we propose a simple graphical method to detect outliers and to investigate goodness of model fit in repeated measures data. The graphical method is based on quantile-quantile (Q-Q) plots of the $\chi^2$ distribution and the standard normal distribution. We also propose diagnostic measures to detect influential observations. The proposed method is illustrated using two examples.
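A $\chi^2$ Q-Q diagnostic of this general kind can be sketched with squared Mahalanobis distances of subject-level residual vectors. The data are synthetic and balanced, and the Wilson-Hilferty quantile approximation is used only to stay dependency-free; neither is taken from the paper:

```python
import numpy as np
from statistics import NormalDist

rng = np.random.default_rng(6)
# Synthetic repeated measures: 40 subjects, 4 time points, correlated within subject
n, t = 40, 4
Sigma = 0.5 * np.ones((t, t)) + 0.5 * np.eye(t)
Y = rng.multivariate_normal(np.zeros(t), Sigma, size=n)

# Squared Mahalanobis distances of subject residual vectors; under a correct
# model these behave like chi-square variates with t degrees of freedom
S = np.cov(Y, rowvar=False)
D = Y - Y.mean(axis=0)
d2 = np.sort(np.einsum('ij,jk,ik->i', D, np.linalg.inv(S), D))

def chi2_quantile(p, k):
    """Wilson-Hilferty approximation to the chi-square quantile (no SciPy)."""
    z = NormalDist().inv_cdf(p)
    return k * (1 - 2 / (9 * k) + z * (2 / (9 * k)) ** 0.5) ** 3

probs = (np.arange(1, n + 1) - 0.5) / n
theo = np.array([chi2_quantile(p, t) for p in probs])
# The Q-Q plot is simply a scatter of theo against d2: a roughly 45-degree
# line indicates adequate fit, and points far above the line flag outliers.
corr = float(np.corrcoef(theo, d2)[0, 1])
```

The standard-normal Q-Q plot mentioned in the abstract works the same way on univariate standardized residuals, with `NormalDist().inv_cdf` supplying the theoretical quantiles.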


Empirical Study for the Technological Forecasting using Delphi Method

  • Kim, Yon-Hyong
    • Communications for Statistical Applications and Methods
    • /
    • v.9 no.2
    • /
    • pp.425-434
    • /
    • 2002
  • In this paper, we evaluated technological forecasting based on questionnaires completed by experts working in the internet-banking industry. We prepared questionnaires covering 13 items. For each item, we examined the specialties of the respondents, the relative importance of the research contents, the expected time of realization, the respondents' confidence in that expected time, and their opinions on how domestic research and development compares with advanced standards. We then carried out various analyses of the data collected through the Delphi method.

Least Squares Estimation with Autocorrelated Residuals : A Survey

  • Rhee, Hak-Yong
    • Journal of the Korean Statistical Society
    • /
    • v.4 no.1
    • /
    • pp.39-56
    • /
    • 1975
  • Ever since Gauss discussed the least-squares method in 1812 and Bertrand translated Gauss's work into French, the least-squares method has been used for various economic analyses. The justification of the least-squares method was given by Markov in 1912, in connection with the earlier discussion by Gauss and Bertrand. The main argument concerned the problem of obtaining the best linear unbiased estimates. In modern language, the argument can be explained as follows.
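The least-squares estimate at issue in the survey, together with the Durbin-Watson statistic, the standard diagnostic for the autocorrelated residuals of its title, can be sketched on synthetic data (all numbers below are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(7)
# Gauss-Markov setup: y = X @ beta + e with zero-mean, homoskedastic errors
n = 200
X = np.c_[np.ones(n), rng.standard_normal((n, 2))]
beta_true = np.array([1.0, 2.0, -0.5])
y = X @ beta_true + 0.3 * rng.standard_normal(n)

# Normal equations X'X b = X'y give the least-squares estimate, which is the
# best linear unbiased estimator (BLUE) under the Gauss-Markov assumptions
b = np.linalg.solve(X.T @ X, X.T @ y)
residuals = y - X @ b

# Durbin-Watson statistic: values near 2 indicate no first-order
# autocorrelation in the residuals; values near 0 or 4 indicate trouble,
# in which case BLUE status is lost and generalized least squares applies
dw = float(np.sum(np.diff(residuals) ** 2) / np.sum(residuals ** 2))
```

When the errors are autocorrelated, as the survey discusses, the same normal-equations estimate remains unbiased but is no longer best, which motivates the alternative estimators it reviews.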
