• Title/Summary/Keyword: Probabilistic sensitivity

Search Results: 217

Reliability-Based Topology Optimization Using Performance Measure Approach (성능함수법을 이용한 신뢰성기반 위상 최적설계)

  • Ahn, Seung-Ho;Cho, Seon-Ho
    • Journal of the Computational Structural Engineering Institute of Korea / v.23 no.1 / pp.37-43 / 2010
  • In this paper, a reliability-based design optimization (RBDO) method is developed for the topology design of linear structures using a performance measure approach. The spatial domain is discretized using three-dimensional Reissner-Mindlin plate elements, and the design variable is taken as the material property of each element. A continuum-based adjoint variable method is employed for the efficient computation of sensitivities with respect to the design and random variables. The performance measure approach of RBDO is employed to evaluate the probabilistic constraints. The topology optimization problem is formulated to have probabilistic displacement constraints. Uncertainties such as material properties and external loads are considered. Numerical examples show that the developed topology optimization method can effectively yield a reliable design, compared with other methods such as the deterministic, safety factor, and worst case approaches.
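The performance measure approach mentioned in this abstract can be illustrated with a minimal sketch: for a target reliability index, the Advanced Mean Value iteration searches for the minimum of the performance function on a sphere in standard normal space, and the probabilistic constraint holds if that minimum is non-negative. The limit state, its gradient, and the target index below are hypothetical placeholders, not the paper's adjoint-based plate-element formulation:

```python
import math

def amv_pma(g, grad, beta_t, dim, iters=20):
    # Advanced Mean Value iteration for the Performance Measure Approach:
    # locate the minimum of g on the sphere ||u|| = beta_t in standard
    # normal space; the probabilistic constraint holds if that minimum >= 0.
    u = [0.0] * dim
    for _ in range(iters):
        gr = grad(u)
        norm = math.sqrt(sum(c * c for c in gr))
        u = [-beta_t * c / norm for c in gr]
    return g(u)

# Hypothetical linear limit state in standard normal space (illustration only)
g = lambda u: 3.0 + u[0] - 0.5 * u[1]
grad_g = lambda u: [1.0, -0.5]
performance_measure = amv_pma(g, grad_g, beta_t=2.0, dim=2)
```

For a linear limit state the iteration converges in one step to 3 - beta_t * ||grad||, so a positive value indicates the displacement constraint is met at the target reliability level.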

Analysis of Consolidation considering Uncertainties of Geotechnical Parameters and Reliability method (지반특성의 불확실성과 신뢰성 기법을 고려한 압밀해석)

  • Lee, Kyu-Hwan
    • Journal of the Korea Institute for Structural Maintenance and Inspection / v.11 no.4 / pp.138-146 / 2007
  • Geotechnical performance on soft ground is strongly dependent on the properties of the soil beneath and adjacent to the structure of interest. These soil properties can be described using deterministic and/or probabilistic models. Deterministic models typically use a single discrete descriptor for the parameter of interest. Probabilistic models describe parameters by using discrete statistical descriptors or probability density functions. The consolidation process depends on several uncertain parameters, including the coefficients of consolidation and the coefficients of permeability in the vertical and horizontal directions. The implication of these uncertain parameters in the design of prefabricated vertical drains for soil improvement is discussed. A sensitivity analysis of the degree of consolidation and of the calculated settlements with respect to these uncertain parameters is presented for clayey deposits.
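The sensitivity of the degree of consolidation to an uncertain coefficient of consolidation can be sketched with a simple Monte Carlo exercise. The lognormal model, the parameter values, and the use of Terzaghi's one-dimensional approximation below are illustrative assumptions, not the paper's analysis:

```python
import math, random

def degree_of_consolidation(Tv):
    # Terzaghi's one-dimensional approximation for the average degree U
    U = math.sqrt(4.0 * Tv / math.pi)
    if U > 0.6:
        U = 1.0 - (8.0 / math.pi**2) * math.exp(-math.pi**2 * Tv / 4.0)
    return min(U, 1.0)

def simulate_U(cv_mean, cv_cov, t, H, n=10000, seed=1):
    # Treat c_v as lognormal with mean cv_mean and coefficient of variation
    # cv_cov; propagate through Tv = c_v * t / H^2 and report mean/std of U.
    rng = random.Random(seed)
    s2 = math.log(1.0 + cv_cov**2)
    mu = math.log(cv_mean) - 0.5 * s2
    samples = [degree_of_consolidation(
        math.exp(rng.gauss(mu, math.sqrt(s2))) * t / H**2) for _ in range(n)]
    mean = sum(samples) / n
    std = (sum((u - mean)**2 for u in samples) / n) ** 0.5
    return mean, std
```

For example, `simulate_U(2.0, 0.3, 1.0, 2.0)` propagates a 30% coefficient of variation in c_v (nominal time factor Tv = 0.5) into the spread of the predicted degree of consolidation.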

Retrofit strategy issues for structures under earthquake loading using sensitivity-optimization procedures

  • Manolis, G.D.;Panagiotopoulos, C.G.;Paraskevopoulos, E.A.;Karaoulanis, F.E.;Vadaloukas, G.N.;Papachristidis, A.G.
    • Earthquakes and Structures / v.1 no.1 / pp.109-127 / 2010
  • This work aims at introducing structural sensitivity analysis capabilities into existing commercial finite element software codes for the purpose of mapping retrofit strategies for a broad group of structures, including heritage-type buildings. More specifically, in the first stage, sensitivity analysis is implemented for the standard deterministic environment, followed by stochastic structural sensitivity analysis defined for the probabilistic environment in a subsequent, second phase. It is believed that this new generation of software, to be released by the industrial partner, will address the needs of a rapidly developing specialty within the engineering design profession, namely commercial retrofit and rehabilitation activities. In congested urban areas, these activities concern a certain percentage of the contemporary building stock that can no longer be demolished to make room for new construction for economic, historical or cultural reasons. Furthermore, such analysis tools are becoming essential in reference to a new generation of national codes that spell out in detail how retrofit strategies ought to be implemented. More specifically, our work focuses on identifying the minimum-cost intervention on a given structure undergoing retrofit. Finally, an additional factor that arises in earthquake-prone regions across the world is the random nature of seismic activity, which further complicates the task of determining the dynamic overstress induced in the building stock and the additional demands placed on the supporting structural system.

Probabilistic Finite Element Analysis of Eigenvalue Problem -Buckling Reliability Analysis of Frame Structure- (고유치 문제의 확률 유한요소 해석)

  • 양영순;김지호
    • Computational Structural Engineering / v.4 no.2 / pp.111-117 / 1991
  • An analysis method for calculating the mean and standard deviation of the eigenvalues of complicated structures, in which the limit state equation is implicitly expressed, is formulated and applied to buckling analysis by combining the probabilistic finite element method with the direct differentiation method, a kind of sensitivity analysis technique. The probability of buckling failure is also calculated by combining classical reliability techniques such as MVFOSM and AFOSM. External load, elastic modulus, sectional moment of inertia, and member length are chosen as random variables, and Parkinson's iteration algorithm is used in AFOSM. The accuracy of the results of this study is verified by comparison with crude Monte Carlo simulation and the importance sampling method. Through case studies of several structures, the important aspects of buckling reliability analysis are discussed.

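The MVFOSM reliability index mentioned in the abstract can be sketched for a hypothetical Euler-buckling limit state; the means and standard deviations below are made up for illustration, and the paper's probabilistic finite element formulation is not reproduced:

```python
import math

def mvfosm_beta(means, stds, g, eps=1e-6):
    # Mean-value first-order second-moment reliability index:
    # beta = g(mu) / sqrt(sum_i (dg/dx_i * sigma_i)^2),
    # with the gradient approximated by forward finite differences.
    g0 = g(*means)
    var = 0.0
    for i, (m, s) in enumerate(zip(means, stds)):
        step = eps * max(abs(m), 1.0)
        x = list(means)
        x[i] = m + step
        dg = (g(*x) - g0) / step
        var += (dg * s) ** 2
    return g0 / math.sqrt(var)

# Hypothetical limit state: Euler buckling load minus applied load,
# g = pi^2 * E * I / L^2 - P  (all values below are illustrative)
def g(E, I, L, P):
    return math.pi**2 * E * I / L**2 - P

beta = mvfosm_beta([2.0e11, 8.0e-6, 3.0, 8.0e5],
                   [1.0e10, 8.0e-7, 0.03, 1.6e5], g)
```

The failure probability then follows from the standard normal CDF as P_f = Phi(-beta), which is where AFOSM or sampling methods take over for implicit limit states.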

Random imperfection effect on reliability of space structures with different supports

  • Roudsari, Mehrzad Tahamouli;Gordini, Mehrdad
    • Structural Engineering and Mechanics / v.55 no.3 / pp.461-472 / 2015
  • The existence of initial imperfections in the manufacturing or assembly of double-layer space structures having hundreds or thousands of members is inevitable. Many of the imperfections, such as the initial curvature of the members and residual stresses in members, are random in nature. In this paper, the probabilistic effect of initial curvature imperfections on the load bearing capacity of double-layer grid space structures with different types of supports has been investigated. First, for the initial curvature imperfection of each member, a random number is generated from a gamma distribution. Then, by employing the same probabilistic model, the imperfections are randomly distributed amongst the members of the structure. Afterwards, the collapse behavior and the ultimate bearing capacity of the structure are determined by using nonlinear push-down analysis, and this procedure is repeated many times. Ultimately, based on the maximum values of bearing capacity acquired from the analysis of the different samples, the structure's reliability is obtained by using the Monte Carlo simulation method. The results show the sensitivity of the collapse behavior of double-layer grid space structures to the random distribution of initial imperfections and to the support type.
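The Monte Carlo procedure described above can be sketched in miniature. The capacity model below is a crude stand-in for the paper's nonlinear push-down analysis, and all parameter values (member count, gamma shape/scale, nominal capacity) are illustrative:

```python
import random

def sample_capacity(n_members, nominal, shape, scale, rng):
    # Draw a random initial-curvature imperfection for every member from a
    # gamma distribution and reduce the nominal capacity by a simple
    # stand-in penalty (the paper uses nonlinear push-down analysis instead).
    mean_imp = sum(rng.gammavariate(shape, scale)
                   for _ in range(n_members)) / n_members
    return nominal * max(0.0, 1.0 - mean_imp)

def reliability(demand, n_samples=500, seed=7):
    # Monte Carlo estimate of P(capacity > demand) over random
    # realizations of the member imperfections
    rng = random.Random(seed)
    ok = sum(sample_capacity(200, 1000.0, 2.0, 0.05, rng) > demand
             for _ in range(n_samples))
    return ok / n_samples
```

Repeating this for each support configuration would give one reliability estimate per configuration, mirroring the comparison reported in the paper.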

A Study on the Probabilistic Nature of Fatigue Crack Propagation Life (I) -The Effect of Distribution of Initial Crack Size- (피로크랙 진전수명의 확률특성에 관한 연구 I -초기크랙길이 분포의 영향-)

  • 윤한용
    • Transactions of the Korean Society of Mechanical Engineers / v.14 no.1 / pp.138-144 / 1990
  • In order to understand the probabilistic nature of fatigue crack propagation, not only the calculation of failure probability and parameter sensitivity but also the clarification of the probabilistic nature of the various parameters should be carried out. Therefore, a method to evaluate synthetically the effect of each parameter on the distribution of fatigue crack propagation life is required. In this study, the effects of the initial crack size and other parameters on the distribution of fatigue crack propagation life are discussed through an appropriate normalization of the life distribution, and the validity of this method is shown. Such an investigation may be useful for understanding the nature of the life distribution and for utilizing probabilistic fracture mechanics.
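The influence of a random initial crack size on the life distribution can be sketched with the Paris law, for which a closed-form cycles-to-failure exists when the exponent m is not 2. The material constants, stress range, and crack sizes below are illustrative assumptions, not the paper's data or its normalization procedure:

```python
import math, random

def paris_life(a0, af, C, m, dsig):
    # Closed-form life from the Paris law da/dN = C * (dsig*sqrt(pi*a))**m,
    # integrated from initial crack a0 to final crack af (valid for m != 2)
    k = C * (dsig * math.sqrt(math.pi)) ** m
    e = 1.0 - m / 2.0
    return (af**e - a0**e) / (k * e)

def life_distribution(a0_mean, a0_cov, af, C, m, dsig, n=5000, seed=3):
    # Lognormal initial crack size; return sample mean and coefficient
    # of variation of the propagation life
    rng = random.Random(seed)
    s2 = math.log(1.0 + a0_cov**2)
    mu = math.log(a0_mean) - 0.5 * s2
    lives = [paris_life(math.exp(rng.gauss(mu, math.sqrt(s2))),
                        af, C, m, dsig) for _ in range(n)]
    mean = sum(lives) / n
    cov = (sum((x - mean)**2 for x in lives) / n) ** 0.5 / mean
    return mean, cov
```

Normalizing each sampled life by a deterministic reference life, as the abstract suggests, would then isolate how much of the life scatter is attributable to the initial crack size alone.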

A Domain Combination Based Probabilistic Framework for Protein-Protein Interaction Prediction (도메인 조합 기반 단백질-단백질 상호작용 확률 예측기법)

  • Han, Dong-Soo;Seo, Jung-Min;Kim, Hong-Soog;Jang, Woo-Hyuk
    • Proceedings of the Korean Society for Bioinformatics Conference / 2003.10a / pp.7-16 / 2003
  • In this paper, we propose a probabilistic framework to predict the interaction probability of proteins. The notions of domain combination and domain combination pair are newly introduced, and the prediction model in the framework takes the domain combination pair as the basic unit of protein interactions to overcome the limitations of conventional domain pair based prediction systems. The framework largely consists of prediction preparation and service stages. In the prediction preparation stage, two appearance probability matrices, which hold information on the appearance frequencies of domain combination pairs in the interacting and non-interacting sets of protein pairs, are constructed. Based on the appearance probability matrix, a probability equation is devised. The equation maps a protein pair to a real number in the range of 0 to 1. Two distributions, for the interacting and non-interacting sets of protein pairs, are obtained using the equation. In the prediction service stage, the interaction probability of a protein pair is predicted using the distributions and the equation. The validity of the prediction model is evaluated for the interacting set of protein pairs in the yeast organism and an artificially generated non-interacting set of protein pairs. When 80% of the set of interacting protein pairs in the DIP database is used as the training set of interacting protein pairs, very high sensitivity (86%) and specificity (56%) are achieved within our framework.

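The idea of mapping a protein pair to a number in [0, 1] from appearance frequencies of domain combination pairs can be sketched as a toy naive-Bayes-style score. The frequency tables, smoothing constant, and prior below are hypothetical; the paper's actual probability equation and matrices are not reproduced:

```python
def interaction_probability(pair_dc, freq_int, freq_non, prior=0.5):
    # Toy version of the appearance-probability idea: for each
    # domain-combination pair observed in the protein pair, compare its
    # frequency in the interacting vs non-interacting training sets.
    # freq_int / freq_non map domain-combination pairs to frequencies;
    # 1e-6 is an ad-hoc floor for unseen pairs.
    p_int, p_non = prior, 1.0 - prior
    for dc in pair_dc:
        p_int *= freq_int.get(dc, 1e-6)
        p_non *= freq_non.get(dc, 1e-6)
    return p_int / (p_int + p_non)
```

A pair whose domain combinations are far more frequent in the interacting training set scores near 1; a pair with no informative evidence stays at the prior.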

Focal Depth Factors in the PSH Analysis

  • Kim, Jun-Kyoung
    • Journal of the Earthquake Engineering Society of Korea / v.2 no.3 / pp.83-86 / 1998
  • The results from the Individual Plant Examination of External Events for the Yonggwang nuclear power plants, units 3 & 4, in Korea have shown that the high degree of diversity in the experts' opinions on seismicity and attenuation models is supposed to be the generic cause of uncertainty in the APEs (annual exceedance probabilities) of the PSHA (probabilistic seismic hazard analysis). This study investigated the sensitivity of the focal depth, which is one of the most uncertain seismicity parameters in Korea. When focal depth is considered, significant differences in the resultant annual exceedance probabilities, and much more symmetrical shapes of the resultant PDFs (probability density functions), are found. These two results suggest that, even for the same seismic input data set, including the seismicity models and ground motion attenuation models, additionally considering focal depth in probabilistic seismic hazard analysis significantly influences the distributions of uncertainties and the probabilities of exceedance per year over the whole range of seismic hazard levels. These facts suggest that it is necessary to derive the focal depth parameter more effectively from the historical and instrumental records of earthquake phenomena in the Korean Peninsula for future PSHA studies.


An Assessment on the Containment Integrity of Korean Standard Nuclear Power Plants Against Direct Containment Heating Loads

  • Seo, Kyung-Woo;Kim, Moo-Hwan;Lee, Byung-Chul;Jeun, Gyoo-Dong
    • Nuclear Engineering and Technology / v.33 no.5 / pp.468-482 / 2001
  • As part of the Direct Containment Heating (DCH) issue resolution for Korean Standard Nuclear Power Plants (KSNPs), a containment load/strength assessment with two different approaches, probabilistic and deterministic, was performed with all plant-specific and phenomena-specific data. For the probabilistic approach, the framework developed to support the Zion DCH study, Two-Cell Equilibrium (TCE) coupled with Latin Hypercube Sampling (LHS), provided a very efficient tool for resolving the DCH issue. For the deterministic approach, an evaluation methodology using the sophisticated mechanistic computer code CONTAIN 2.0 was developed, based on findings from DCH-related experiments and analyses. For three bounding scenarios, designated Scenarios V, Va, and VI, the calculation results of TCE/LHS and CONTAIN 2.0, with conservative or typical estimates for the uncertain parameters, showed that containment failure resulting from DCH loads was not likely to occur. To verify that these two approaches might be conservative, the containment loads resulting from typical high-pressure accident scenarios (SBO and SBLOCA) for KSNPs were also predicted. The CONTAIN 2.0 calculations with boundary and initial conditions from the MAAP4 predictions, including sensitivity calculations for DCH phenomenological parameters, confirmed that the predicted containment pressure and temperature were much below those from the two approaches; therefore, the DCH issue for KSNPs might not be a problem.


Composite Dependency-reflecting Model for Core Promoter Recognition in Vertebrate Genomic DNA Sequences

  • Kim, Ki-Bong;Park, Seon-Hee
    • BMB Reports / v.37 no.6 / pp.648-656 / 2004
  • This paper deals with the development of a predictive probabilistic model, a composite dependency-reflecting model (CDRM), which was designed to detect core promoter regions and transcription start sites (TSS) in vertebrate genomic DNA sequences, an issue of some importance for genome annotation. The model represents a combination of first-, second-, third- and much higher order or long-range dependencies obtained using the expanded maximal dependency decomposition (EMDD) procedure, which iteratively decomposes data sets into subsets on the basis of the dependency degree and patterns inherent in the target promoter region to be modeled. In addition, the decomposed subsets are modeled by using a first-order Markov model, allowing the predictive model to reflect dependency between adjacent positions explicitly. In this way, the CDRM allows for potentially complex dependencies between positions in the core promoter region. Such complex dependencies may be closely related to the biological and structural contexts, since promoter elements are present in various combinations separated by various distances in the sequence. Thus, the CDRM may be appropriate for recognizing core promoter regions and TSSs in vertebrate genomic contigs. To demonstrate the effectiveness of our algorithm, we tested it using standardized data and real core promoters, and compared it with some current representative promoter-finding algorithms. The developed algorithm showed better accuracy in terms of specificity and sensitivity than the promoter-finding algorithms used in the performance comparison.
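The first-order Markov model used to score the decomposed subsets can be sketched as follows; the training sequences and the add-one smoothing are illustrative stand-ins, not the CDRM's actual training data or estimator:

```python
import math
from collections import defaultdict

def train_markov(sequences, alphabet="ACGT"):
    # First-order Markov model over DNA: estimate P(x_i | x_{i-1}) from
    # adjacent-position counts in the training sequences, with add-one
    # smoothing so unseen transitions keep nonzero probability.
    counts = defaultdict(lambda: defaultdict(int))
    for seq in sequences:
        for a, b in zip(seq, seq[1:]):
            counts[a][b] += 1
    probs = {}
    for a in alphabet:
        total = sum(counts[a][b] for b in alphabet) + len(alphabet)
        probs[a] = {b: (counts[a][b] + 1) / total for b in alphabet}
    return probs

def log_likelihood(seq, probs):
    # Score a candidate region under the trained transition model
    return sum(math.log(probs[a][b]) for a, b in zip(seq, seq[1:]))
```

In a CDRM-style setup, one such model per decomposed subset would be trained, and a candidate core promoter region would be scored by the model of the subset it falls into.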