• Title/Summary/Keyword: Stochastic order

Iterative-R: A reliability-based calibration framework of response modification factor for steel frames

  • Soleimani-Babakamali, Mohammad Hesam;Nasrollahzadeh, Kourosh;Moghadam, Amin
    • Steel and Composite Structures
    • /
    • v.42 no.1
    • /
    • pp.59-74
    • /
    • 2022
  • This study introduces a general reliability-based, performance-based design framework for designing frames with respect to their uncertainties and user-defined design goals. The Iterative-R method, extracted from the main framework, can designate a proper R (i.e., response modification factor) satisfying the design goal with respect to a target reliability index and a pre-defined probability of collapse. The proposed methodology is based on FEMA P-695 and can be used for all systems to which FEMA P-695 applies. To exemplify the method, multiple three-dimensional, four-story steel special moment-resisting frames are considered. Closed-form relationships are fitted between the frames' responses and the modeling parameters. These fits are used to construct limit state functions, to which reliability analysis methods are applied for design safety assessment and the selection of a proper R. The framework's unique feature is that it considers arbitrarily defined probability density functions of the frames' modeling parameters with an insignificant analysis burden. This characteristic enables altering those parameters' distributions to meet the design goal. Furthermore, through sensitivity analysis, the most impactful parameters can be identified for possible improvements toward the design goal. In the studied examples, it is revealed that a proper R for frames with different levels of uncertainty can differ significantly from the values suggested in design codes, highlighting the importance of considering the stochastic nature of the elements' nonlinear behavior.
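
The link between the target reliability index and the probability of collapse mentioned above is the standard normal relation P_f = Φ(−β). A minimal sketch (not the paper's Iterative-R procedure; the numbers are illustrative):

```python
import math

def failure_probability(beta: float) -> float:
    """Probability of failure implied by a reliability index beta,
    P_f = Phi(-beta), using the standard normal CDF."""
    return 0.5 * math.erfc(beta / math.sqrt(2.0))

def reliability_index(p_f: float) -> float:
    """Invert P_f = Phi(-beta) by bisection (no SciPy dependency)."""
    lo, hi = -10.0, 10.0
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if failure_probability(mid) > p_f:
            lo = mid  # failure probability too high: beta must grow
        else:
            hi = mid
    return 0.5 * (lo + hi)

# A target beta of about 3.09 corresponds to roughly a 1e-3 collapse probability.
print(round(failure_probability(3.09), 4))
```

This is the textbook conversion both directions; a calibration framework like the one above would wrap it around repeated structural analyses.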

Stochastics and Artificial Intelligence-based Analytics of Wastewater Plant Operation

  • Sung-Hyun Kwon;Daechul Cho
    • Clean Technology
    • /
    • v.29 no.2
    • /
    • pp.145-150
    • /
    • 2023
  • Tele-metering systems (TMS) have been useful tools for managing domestic wastewater treatment plants (WWTPs) over the last decade. They mostly generate water-quality data for discharged water to ensure that it complies with mandatory regulations, and they may be able to produce every operational parameter and additional measurements in the near future. A sub-big-data group comprising about 150,000 data points from four domestic WWTPs was classified and analyzed to optimize the WWTP process. We used the Statistical Product and Service Solutions (SPSS) 25 package to treat the data statistically with linear regression and correlation analysis. The major independent variables for the analysis were water temperature, sludge recycle rate, electricity used, and influent water quality, while the dependent variables representing effluent water quality included total nitrogen, the most emphasized index for discharged flow in plants. Water temperature and consumed electricity showed a strong correlation with total nitrogen, but the remaining mutual correlations among the variables were found to be inconclusive due to the large errors involved. In addition, a multilayer perceptron analysis was applied to the TMS data along with root mean square error (RMSE) analysis. This study showed that the RMSEs of the SS, T-N, and TOC predictions were in the range of 10% to 20%.
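
The core of the statistical treatment described above is an ordinary least-squares fit followed by an RMSE check. A minimal stand-alone sketch with made-up temperature and total-nitrogen values (not data from the study):

```python
# Least-squares fit of effluent total nitrogen (T-N) against water
# temperature, then an RMSE check. Values are purely illustrative.
temperature = [10.0, 12.5, 15.0, 17.5, 20.0, 22.5, 25.0]
total_n     = [14.1, 13.2, 12.6, 11.5, 10.9,  9.8,  9.1]

n = len(temperature)
mean_x = sum(temperature) / n
mean_y = sum(total_n) / n
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(temperature, total_n)) \
        / sum((x - mean_x) ** 2 for x in temperature)
intercept = mean_y - slope * mean_x

predictions = [intercept + slope * x for x in temperature]
rmse = (sum((y - p) ** 2 for y, p in zip(total_n, predictions)) / n) ** 0.5
print(round(slope, 3), round(rmse, 3))
```

A negative slope here would mirror the strong temperature/T-N correlation the study reports; a multilayer perceptron would replace the linear model but keep the same RMSE evaluation.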

Bivariate Oscillation Model for Surrogating Climate Change Scenarios in the LCRR basin

  • Lee, Taesam;Ouarda, Taha;Ahn, Yujin
    • Proceedings of the Korea Water Resources Association Conference
    • /
    • 2021.06a
    • /
    • pp.69-69
    • /
    • 2021
  • During the unprecedented 2011 spring flood, residents along Lake Champlain and the Richelieu River suffered enormous damage. The International Joint Commission (IJC) released the Lake Champlain-Richelieu River (LCRR) Plan of Study (PoS). One of the major tasks of the PoS is to investigate possible scenarios in the LCRR basin based on stochastic simulation of the Net Basin Supplies (NBS), which quantify the amount of flow into the lake and the river. The current study therefore proposed a novel approach that simulates the annual NBS by teleconnecting it with a climate index. The proposed model employed bivariate empirical mode decomposition to contemporaneously model the long-term evolution of the nonstationary oscillation embedded in the annual NBS and the climate signal (here, the Arctic Oscillation, AO). To represent the variation of the NBS correlation structure along with the temporal evolution of the climate index, a new nonstationary parameterization concept is proposed. The results indicate that the proposed model shows superior performance in preserving long- and short-term temporal correlation; it even preserves the Hurst coefficient better than any of the other tested models.
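
The Hurst coefficient mentioned at the end can be estimated with the classical rescaled-range (R/S) method: compute the R/S statistic over several window sizes and take the log-log slope. A rough sketch (not the paper's bivariate EMD model; white noise is used only as a sanity check, where H should be near 0.5):

```python
import math
import random

def rescaled_range(series):
    """R/S statistic of one window: range of cumulative deviations
    from the mean, divided by the standard deviation."""
    n = len(series)
    mean = sum(series) / n
    cum, cums = 0.0, []
    for x in series:
        cum += x - mean
        cums.append(cum)
    r = max(cums) - min(cums)
    s = math.sqrt(sum((x - mean) ** 2 for x in series) / n)
    return r / s

def hurst(series, window_sizes=(8, 16, 32, 64)):
    """Slope of log(mean R/S) against log(window size)."""
    xs, ys = [], []
    for w in window_sizes:
        rs_values = [rescaled_range(series[i:i + w])
                     for i in range(0, len(series) - w + 1, w)]
        xs.append(math.log(w))
        ys.append(math.log(sum(rs_values) / len(rs_values)))
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
           sum((x - mx) ** 2 for x in xs)

random.seed(0)
white_noise = [random.gauss(0.0, 1.0) for _ in range(512)]
print(round(hurst(white_noise), 2))  # white noise should land near 0.5
```

Small-sample R/S estimates are biased upward, which is one reason preserving the Hurst coefficient in synthetic series is a demanding test.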

Reclaiming Multifaceted Financial Risk Information from Correlated Cash Flows under Uncertainty

  • Byung-Cheol Kim;Euysup Shim;Seong Jin Kim
    • International conference on construction engineering and project management
    • /
    • 2013.01a
    • /
    • pp.602-607
    • /
    • 2013
  • Financial risks associated with capital investments are often measured with different feasibility indicators such as the net present value (NPV), the internal rate of return (IRR), the payback period (PBP), and the benefit-cost ratio (BCR). This paper aims at demonstrating practical applications of probabilistic feasibility analysis techniques for an integrated feasibility evaluation of the IRR and PBP. The IRR and PBP are analyzed concurrently in order to measure the profitability and liquidity, respectively, of a cash flow. The cash flow data of a real wind turbine project is used in the study. The presented approach consists of two phases. First, two newly reported analysis techniques are used to carry out a series of what-if analyses for the IRR and PBP. Second, the relationship between the IRR and PBP is identified using Monte Carlo simulation. The results demonstrate that the integrated feasibility evaluation of stochastic cash flows becomes a more viable option with the aid of newly developed probabilistic analysis techniques. It is also shown that the relationship between the IRR and PBP for the wind turbine project can be used as a predictive model for the actual IRR at the end of the service life based on the actual PBP of the project early in the service life.
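
The two indicators analyzed above can be computed directly from a cash-flow series: the IRR is the rate at which the NPV vanishes, and the PBP is the first period at which the cumulative cash flow turns non-negative. A minimal sketch with a hypothetical project (not the wind-turbine data):

```python
def npv(rate, cash_flows):
    """Net present value of period-indexed cash flows (index 0 = today)."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

def irr(cash_flows, lo=-0.99, hi=1.0, iterations=100):
    """Internal rate of return by bisection; assumes NPV crosses zero once."""
    for _ in range(iterations):
        mid = 0.5 * (lo + hi)
        if npv(mid, cash_flows) > 0.0:
            lo = mid  # NPV still positive: discount rate can rise
        else:
            hi = mid
    return 0.5 * (lo + hi)

def payback_period(cash_flows):
    """First period at which the cumulative cash flow turns non-negative."""
    cumulative = 0.0
    for t, cf in enumerate(cash_flows):
        cumulative += cf
        if cumulative >= 0.0:
            return t
    return None  # never recovered within the horizon

# Hypothetical project: 1000 invested today, 300 per year for 5 years.
flows = [-1000.0] + [300.0] * 5
print(payback_period(flows), round(irr(flows), 4))
```

A Monte Carlo study of the IRR-PBP relationship, as in the paper, would randomize the annual flows and collect the (PBP, IRR) pairs.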

Multicriteria shape design of a sheet contour in stamping

  • Oujebbour, Fatima-Zahra;Habbal, Abderrahmane;Ellaia, Rachid;Zhao, Ziheng
    • Journal of Computational Design and Engineering
    • /
    • v.1 no.3
    • /
    • pp.187-193
    • /
    • 2014
  • One of the hottest challenges in the automotive industry is related to weight reduction in sheet metal forming processes, in order to produce a high-quality metal part at minimal material cost. Stamping is the most widely used sheet metal forming process, but its implementation comes with several fabrication flaws such as springback and failure. A global and simple approach to circumvent these unwanted process drawbacks consists in optimizing the initial blank shape with innovative methods. The aim of this paper is to introduce an efficient methodology to deal with complex, computationally expensive multicriteria optimization problems. Our approach is based on a combination of methods to capture the Pareto front, approximate criteria (to save computational cost), and global optimizers. To illustrate its efficiency, we consider the stamping of an industrial workpiece as a test case. Our approach is applied to the springback and failure criteria. To optimize these two criteria, a global optimization algorithm was chosen: the Simulated Annealing algorithm hybridized with Simultaneous Perturbation Stochastic Approximation, in order to gain both speed and precision. The multicriteria problem amounts to capturing the Pareto front associated with the two criteria. The Normal Boundary Intersection and Normalized Normal Constraint methods are considered for generating a set of Pareto-optimal solutions with uniformly distributed front points. The computational results are compared with those obtained with the well-known Non-dominated Sorting Genetic Algorithm II. The results show that the proposed approach deals efficiently with the multicriteria shape optimization of highly nonlinear mechanical systems.
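
The global optimizer named above, simulated annealing, can be sketched in a few lines; the SPSA hybridization used in the paper is omitted here, and the Rastrigin function stands in for the (expensive) stamping criteria:

```python
import math
import random

def simulated_annealing(f, x0, step=0.5, t0=1.0, cooling=0.995, iters=2000):
    """Plain simulated annealing on a function of a real vector.
    Worse moves are accepted with probability exp(-delta / T)."""
    random.seed(42)
    x, fx = list(x0), f(x0)
    best, fbest = list(x), fx
    t = t0
    for _ in range(iters):
        cand = [xi + random.uniform(-step, step) for xi in x]
        fc = f(cand)
        if fc < fx or random.random() < math.exp(-(fc - fx) / t):
            x, fx = cand, fc          # accept the move
            if fx < fbest:
                best, fbest = list(x), fx
        t *= cooling                  # geometric cooling schedule
    return best, fbest

# Toy criterion with many local minima; global minimum 0 at the origin.
def rastrigin(v):
    return sum(vi * vi - 10.0 * math.cos(2.0 * math.pi * vi) + 10.0 for vi in v)

best, value = simulated_annealing(rastrigin, [3.0, -2.0])
print(round(value, 2))
```

Capturing a Pareto front, as with NBI or NNC above, would run such a single-objective solver repeatedly on scalarized subproblems.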

On the Numerical Stability of Dynamic Reliability Analysis Method (동적 신뢰성 해석 기법의 수치 안정성에 관하여)

  • Lee, Do-Geun;Ok, Seung-Yong
    • Journal of the Korean Society of Safety
    • /
    • v.35 no.3
    • /
    • pp.49-57
    • /
    • 2020
  • In comparison with existing static reliability analysis methods, the dynamic reliability analysis (DyRA) method is more suitable for estimating the failure probability of a structure subjected to earthquake excitations because it can take into account the frequency characteristics and damping capacity of the structure. However, DyRA is known to have an issue of numerical stability due to the uncertainty in the random sampling of the earthquake excitations. To resolve this numerical stability issue in the DyRA approach, this study proposed two earthquake-scale factors. The first factor is defined as the ratio of the first earthquake excitation to the maximum value of the remaining excitations, and the second factor is defined as the condition number of the matrix consisting of the earthquake excitations. We then performed parametric studies of the two factors' effects on the numerical stability of the DyRA method. In the illustrative example, it was clearly confirmed that the two factors can be used to verify the numerical stability of the proposed DyRA method. However, there is a difference between the two factors. The first factor showed some overlap between the stable and the unstable results, so additional reliability analysis is required to guarantee the stability of the DyRA method. In contrast, the second factor clearly distinguished the stable and unstable results of the DyRA method without any overlapping region. Therefore, the second factor can be regarded as a better criterion than the first for determining whether the proposed DyRA method guarantees numerical stability. In addition, the accuracy of the numerical results of the proposed DyRA has been verified in comparison with those of the existing first-order reliability method (FORM), Monte Carlo simulation (MCS), and subset simulation method (SSM). The comparative results confirmed that the proposed DyRA method can provide accurate and reliable estimates of the structural failure probability while maintaining superior numerical efficiency over the existing methods.
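
Monte Carlo simulation, one of the reference methods compared above, estimates a failure probability by counting samples that violate the limit state. A minimal sketch for a generic limit state g = R − S with assumed normal resistance and load (not the earthquake problem in the paper):

```python
import random

def mcs_failure_probability(n_samples=200_000, seed=1):
    """Crude Monte Carlo estimate of P[g <= 0] for the limit state
    g = R - S with R ~ N(8, 1) resistance and S ~ N(5, 1) load.
    Exact answer: Phi(-3 / sqrt(2)) ~= 0.0169."""
    random.seed(seed)
    failures = 0
    for _ in range(n_samples):
        r = random.gauss(8.0, 1.0)   # resistance sample
        s = random.gauss(5.0, 1.0)   # load sample
        if r - s <= 0.0:
            failures += 1
    return failures / n_samples

print(round(mcs_failure_probability(), 4))
```

For the small failure probabilities typical of structures, plain MCS needs very many samples, which is exactly why methods like FORM and subset simulation are used as faster alternatives.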

Weighted Integral Method for an Estimation of Displacement COV of Laminated Composite Plates (복합적층판의 변위 변동계수 산정을 위한 가중적분법)

  • Noh, Hyuk-Chun
    • Journal of the Korean Society for Advanced Composite Structures
    • /
    • v.1 no.2
    • /
    • pp.29-35
    • /
    • 2010
  • In addition to Young's modulus, Poisson's ratio is at the center of attention in the field of stochastic finite element analysis, since these parameters play an important role in determining structural behavior. Accordingly, the sole effect of this parameter on the response variability is of importance from the perspective of estimating the uncertain response. To this end, a formulation is suggested to determine the response variability in laminated composite plates due to the spatial randomness of Poisson's ratio. The independent contributions of the random Poisson's ratio can be captured in terms of sub-matrices that include the effect of the random parameter in the same order, which can be attained by using a Taylor series expansion about the mean of the parameter. In order to validate the adequacy of the proposed formulation, several example analyses are performed, and the results are compared with Monte Carlo simulation (MCS). Good agreement between the suggested scheme and MCS is observed, showing the adequacy of the scheme.
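
A Taylor expansion about the mean of the random parameter, as used in the formulation above, underlies first-order variance propagation: Var[u] ≈ (du/dν)² Var[ν]. A minimal first-order second-moment (FOSM) sketch with an illustrative scalar response (not the laminate formulation itself):

```python
# FOSM sketch: the response u(nu) = c * (1 - nu**2) mimics a displacement
# inversely proportional to the plane-stress stiffness factor E / (1 - nu^2).
# The response function and all numbers are purely illustrative.

def response(nu, c=1.0):
    return c * (1.0 - nu ** 2)

mean_nu, cov_nu = 0.3, 0.1            # assumed mean and COV of Poisson's ratio
std_nu = cov_nu * mean_nu

# Central finite difference for du/dnu at the mean of the parameter.
h = 1e-6
du_dnu = (response(mean_nu + h) - response(mean_nu - h)) / (2.0 * h)

mean_u = response(mean_nu)
std_u = abs(du_dnu) * std_nu          # first-order variance propagation
cov_u = std_u / mean_u                # displacement COV, the paper's quantity
print(round(cov_u, 4))
```

An MCS validation, as in the paper, would sample ν repeatedly and compare the sampled COV of u with this first-order estimate.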

Studies on the Stochastic Generation of Long Term Runoff (1) (장기유출랑의 추계학적 모의 발생에 관한 연구 (I))

  • 이순혁;맹승진;박종국
    • Magazine of the Korean Society of Agricultural Engineers
    • /
    • v.35 no.3
    • /
    • pp.100-116
    • /
    • 1993
  • It is an established fact that unreasonable design criteria and unsuitable operation management for agricultural structures, including reservoirs, based on short-term records of monthly flows have brought about not only loss of life but also enormous property damage. To address this issue, this study was conducted to simulate long series of synthetic monthly flows with a multi-season first-order Markov model using the best-fitting frequency distribution, and to compare statistical parameters between observed and synthetic flows for six watersheds in the Yeong San and Seom Jin river systems. The results obtained through this study can be summarized as follows. 1. Both the Gamma and the two-parameter lognormal distribution were found to be suitable for monthly flows in all watersheds by the Kolmogorov-Smirnov test, while those distributions were judged unfit for the Nam Pyeong watershed of the Yeong San system and the Song Jeong and Ab Rog watersheds of the Seom Jin river system by the $\chi^2$ goodness-of-fit test. 2. Most of the arithmetic means of the synthetic monthly flows simulated with the Gamma distribution are much closer to those of the observed data than those of the two-parameter lognormal distribution in the applied watersheds. 3. The fluctuation of the coefficient of variation derived with the Gamma distribution generally agrees better with the observed data than that of the two-parameter lognormal distribution in both the Yeong San and Seom Jin river systems; in particular, the coefficients of variation calculated with the Gamma distribution are much closer to those of the observed data during July and August. 4. It can be concluded that the synthetic monthly flows simulated with the Gamma distribution are closer to the observed data than those simulated with the two-parameter lognormal distribution in the applied watersheds. 5. It is desirable that the multi-season first-order Markov model based on the Gamma distribution, confirmed as a good fit in this study, be compared with a harmonic synthetic model in a follow-up study.
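
A multi-season first-order Markov model of the kind used above is often written in Thomas-Fiering form, where each month's flow regresses on the previous month's. A sketch with hypothetical monthly statistics and normal innovations (the study fitted Gamma marginals; that refinement is omitted here):

```python
import random

# Hypothetical monthly statistics (12 values each): mean, standard deviation,
# and lag-1 correlation between month m and month m+1. Not from the study.
means = [30, 28, 45, 60, 80, 120, 260, 300, 150, 70, 45, 35]
stds  = [10,  9, 15, 20, 25,  40,  90, 100,  50, 22, 14, 11]
rhos  = [0.5, 0.5, 0.4, 0.4, 0.3, 0.3, 0.2, 0.3, 0.4, 0.5, 0.6, 0.5]

def thomas_fiering(n_years, seed=7):
    """Multi-season lag-1 Markov (Thomas-Fiering) synthetic monthly flows."""
    random.seed(seed)
    flows, q = [], means[0]
    for _ in range(n_years):
        for m in range(12):
            nxt = (m + 1) % 12
            b = rhos[m] * stds[nxt] / stds[m]      # seasonal regression slope
            noise = random.gauss(0.0, 1.0)
            q = means[nxt] + b * (q - means[m]) \
                + noise * stds[nxt] * (1.0 - rhos[m] ** 2) ** 0.5
            q = max(q, 0.0)                        # flows cannot be negative
            flows.append(q)
    return flows

synthetic = thomas_fiering(50)
mean_flow = sum(synthetic) / len(synthetic)
print(round(mean_flow, 1))
```

Replacing the normal innovations with Gamma-distributed ones, as the study recommends, preserves the skewness that monthly flows typically show in July and August.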

Study on Estimations of Initial Mass Fractions of CH4/O2 in Diffusion-Controlled Turbulent Combustion Using Inverse Analysis (확산지배 난류 연소현상에서 역해석을 이용한 CH4/O2의 초기 질량분율 추정에 관한 연구)

  • Lee, Kyun-Ho;Baek, Seung-Wook
    • Transactions of the Korean Society of Mechanical Engineers B
    • /
    • v.34 no.7
    • /
    • pp.679-688
    • /
    • 2010
  • The major objective of the present study is to extend the applications of inverse analysis to more realistic engineering fields with a complex combustion process rather than the traditional simple heat-transfer problems. In order to do this, the unknown initial mass fractions of $CH_4/O_2$ are estimated from the temperature measurement data by inverse analysis in the practical diffusion-controlled turbulent combustion problem. In order to ensure efficient inverse analysis, the repulsive particle swarm optimization (RPSO) method, which belongs to the class of stochastic evolutionary global optimization methods, is implemented as an inverse solver. Based on this study, it is expected that useful information can be obtained when inverse analysis is used in the diagnosis, design, or optimization of real combustion systems involving unknown parameters.
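
Particle swarm optimization, the family to which the RPSO solver above belongs, can be sketched as follows; this is a plain global-best PSO on a toy inverse problem (the repulsive term of RPSO and the combustion forward model are omitted):

```python
import random

def pso(f, dim=2, n_particles=20, iters=200, seed=3,
        w=0.7, c1=1.5, c2=1.5, bounds=(-5.0, 5.0)):
    """Minimal global-best PSO minimizing f over a box. The repulsive
    variant (RPSO) would add a term pushing particles apart."""
    random.seed(seed)
    lo, hi = bounds
    xs = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vs = [[0.0] * dim for _ in range(n_particles)]
    pbest = [list(x) for x in xs]
    pval = [f(x) for x in xs]
    g = min(range(n_particles), key=lambda i: pval[i])
    gbest, gval = list(pbest[g]), pval[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vs[i][d] = (w * vs[i][d]
                            + c1 * r1 * (pbest[i][d] - xs[i][d])   # cognitive pull
                            + c2 * r2 * (gbest[d] - xs[i][d]))     # social pull
                xs[i][d] = min(max(xs[i][d] + vs[i][d], lo), hi)
            fx = f(xs[i])
            if fx < pval[i]:
                pbest[i], pval[i] = list(xs[i]), fx
                if fx < gval:
                    gbest, gval = list(xs[i]), fx
    return gbest, gval

# Recover hypothetical "unknown parameters" (0.4, 0.6) by minimizing squared
# mismatch, mimicking an inverse problem fit to measurement data.
target = (0.4, 0.6)
best, err = pso(lambda v: (v[0] - target[0]) ** 2 + (v[1] - target[1]) ** 2)
print(round(err, 6))
```

In the inverse analysis above, the objective would instead measure the mismatch between simulated and measured temperatures, with the initial mass fractions as the search variables.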

The Strength of the Relationship between Semantic Similarity and the Subcategorization Frames of the English Verbs: a Stochastic Test based on the ICE-GB and WordNet (영어 동사의 의미적 유사도와 논항 선택 사이의 연관성 : ICE-GB와 WordNet을 이용한 통계적 검증)

  • Song, Sang-Houn;Choe, Jae-Woong
    • Language and Information
    • /
    • v.14 no.1
    • /
    • pp.113-144
    • /
    • 2010
  • The primary goal of this paper is to find a feasible way to answer the question: does similarity in meaning between verbs relate to similarity in their subcategorization? In order to answer this question in a concrete way on the basis of a large set of English verbs, this study made use of various language resources, tools, and statistical methodologies. We first compiled a list of 678 verbs selected from the most and second-most frequent word lists of the Collins COBUILD English Dictionary which also appeared in WordNet 3.0. We calculated similarity measures between all pairs of the words based on the 'jcn' algorithm (Jiang and Conrath, 1997) implemented in the WordNet::Similarity module (Pedersen, Patwardhan, and Michelizzi, 2004). A clustering process followed: first building similarity matrices out of the similarity measure values, next drawing dendrograms on the basis of the matrices, and finally obtaining 177 meaningful clusters (covering 437 verbs) that passed a threshold set by z-score. The subcategorization frames and their frequency values were taken from the ICE-GB. In order to calculate the Selectional Preference Strength (SPS) of the relationship between a verb and its subcategorizations, we relied on the Kullback-Leibler divergence model (Resnik, 1996). The SPS values of the verbs in the same cluster were compared with each other, which served to give statistical values indicating how much the SPS values overlap between the subcategorization frames of the verbs. Our final analysis shows that the degree of overlap, i.e., the relationship between semantic similarity and the subcategorization frames of English verbs, is evenly spread out from 'very strongly related' to 'very weakly related'.
Some semantically similar verbs share a lot in terms of their subcategorization frames, and some others indicate an average degree of strength in the relationship, while the others, though still semantically similar, tend to share little in their subcategorization frames.
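
Resnik's (1996) selectional preference strength used above is the Kullback-Leibler divergence between a verb's frame distribution and the prior frame distribution. A minimal sketch with hypothetical frame probabilities (not the ICE-GB counts used in the paper):

```python
import math

def selectional_preference_strength(p_frame_given_verb, p_frame):
    """Kullback-Leibler divergence D(P(frame|verb) || P(frame)) in bits,
    i.e. Resnik's (1996) selectional preference strength."""
    return sum(p * math.log(p / q, 2)
               for p, q in zip(p_frame_given_verb, p_frame) if p > 0.0)

# Hypothetical distributions over four frames (e.g. NP, NP-NP, NP-PP, S-comp).
prior        = [0.40, 0.20, 0.30, 0.10]   # P(frame) over all verbs
give_like    = [0.10, 0.60, 0.25, 0.05]   # a ditransitive-heavy verb
indifferent  = list(prior)                # a verb with no frame preference

strong = selectional_preference_strength(give_like, prior)
zero   = selectional_preference_strength(indifferent, prior)
print(round(strong, 3), round(zero, 3))
```

A verb whose frame distribution matches the prior has SPS 0; the larger the divergence, the stronger the verb's subcategorization preference, which is what the cluster-level overlap comparison above measures.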
