• Title/Summary/Keyword: Top Event Probability

Calculation of Top Event Probability of Fault Tree using BDD (BDD를 이용한 사고수목 정상사상확률 계산)

  • Cho, Byeong Ho; Yum, Byeoungsoo; Kim, Sangahm
    • Journal of the Korea Institute of Information and Communication Engineering, v.20 no.3, pp.654-662, 2016
  • As the number of gates and basic events in a fault tree increases, it becomes difficult to calculate the exact probability of the top event. To overcome this difficulty, the BDD methodology can be used to calculate the exact top event probability of small and medium-sized fault trees in a short time. Fault trees are converted to BDDs using CUDD library functions, and a failure-path search algorithm is proposed to calculate the exact top event probability. The backward search algorithm is more efficient than the forward one at finding failure paths and at calculating the top event probability. It reduces the time spent identifying disjoint failure paths in the BDD and can be considered an effective tool for finding the cut sets and minimal cut sets of a given fault tree.
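
The heart of this approach is that in a BDD every root-to-1 path is a disjoint failure path, so the exact top event probability is the sum of the path probabilities, or equivalently a Shannon decomposition at each node. A minimal Python sketch of that evaluation, not the authors' CUDD-based implementation; the Node class, the tiny A AND (B OR C) example, and the probabilities are illustrative assumptions:

```python
# A minimal sketch (illustrative, not the authors' CUDD-based code) of
# computing an exact top event probability from a BDD.
class Node:
    """BDD node: internal nodes test one basic event; leaves hold 0 or 1."""
    def __init__(self, var=None, low=None, high=None, terminal=None):
        self.var = var            # basic event tested at this node
        self.low = low            # branch taken when the event does not occur
        self.high = high          # branch taken when the event occurs
        self.terminal = terminal  # 0 or 1 at a leaf, else None

ZERO, ONE = Node(terminal=0), Node(terminal=1)

def top_event_probability(node, p):
    """Shannon decomposition: P = p[x]*P(high) + (1 - p[x])*P(low)."""
    if node.terminal is not None:
        return float(node.terminal)
    return (p[node.var] * top_event_probability(node.high, p)
            + (1.0 - p[node.var]) * top_event_probability(node.low, p))

# Hypothetical example: top event = A AND (B OR C), variable order A < B < C.
c = Node('C', low=ZERO, high=ONE)
b = Node('B', low=c, high=ONE)
root = Node('A', low=ZERO, high=b)
print(top_event_probability(root, {'A': 0.1, 'B': 0.2, 'C': 0.3}))
# 0.1 * (0.2 + 0.8 * 0.3) = 0.044
```

Summing the probabilities of the two disjoint root-to-1 paths (A and B occur: 0.02; A occurs, B does not, C occurs: 0.024) gives the same 0.044, which is the structure the paper's backward path search exploits.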

A top-down iteration algorithm for Monte Carlo method for probability estimation of a fault tree with circular logic

  • Han, Sang Hoon
    • Nuclear Engineering and Technology, v.50 no.6, pp.854-859, 2018
  • Calculating minimal cut sets is a typical quantification method used to evaluate the top event probability of a fault tree. If minimal cut sets cannot be calculated, or if the accuracy of the quantification result is in doubt, the Monte Carlo method provides an alternative for fault tree quantification. The Monte Carlo method tends to take a long time because it repeats the calculation over a large number of samples. Herein, we propose an improved quantification algorithm for fault trees with circular logic: a top-down iteration algorithm that combines the characteristics of the top-down approach and the iteration approach, thereby reducing the computation time of the Monte Carlo method.
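
The baseline Monte Carlo scheme the paper accelerates can be sketched directly: sample the basic events, resolve the gates, and average the top event indicator. A circular gate pair cannot be resolved in one bottom-up pass, so the sketch below iterates the gate equations to a fixed point. The gate structure and probabilities are hypothetical, and this plain iteration is not the paper's combined top-down iteration algorithm:

```python
# Minimal Monte Carlo sketch for a fault tree with circular logic
# (hypothetical toy model, not the paper's benchmark or algorithm).
import random

def sample_top_event(p, rng):
    # Sample basic events as independent Bernoulli trials.
    x = {e: rng.random() < q for e, q in p.items()}
    # Circular logic: G1 = B1 OR G2, G2 = B2 AND (B3 OR G1).
    # Start both gates at False and iterate to a fixed point;
    # monotone AND/OR gates guarantee convergence.
    g1 = g2 = False
    while True:
        new_g1 = x['B1'] or g2
        new_g2 = x['B2'] and (x['B3'] or g1)
        if (new_g1, new_g2) == (g1, g2):
            return g1  # G1 is the top event
        g1, g2 = new_g1, new_g2

def monte_carlo_tep(p, n=100_000, seed=1):
    rng = random.Random(seed)
    return sum(sample_top_event(p, rng) for _ in range(n)) / n

print(monte_carlo_tep({'B1': 0.01, 'B2': 0.05, 'B3': 0.02}))
```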

FAST BDD TRUNCATION METHOD FOR EFFICIENT TOP EVENT PROBABILITY CALCULATION

  • Jung, Woo-Sik; Han, Sang-Hoon; Yang, Joon-Eon
    • Nuclear Engineering and Technology, v.40 no.7, pp.571-580, 2008
  • A Binary Decision Diagram (BDD) is a graph-based data structure that yields an exact top event probability (TEP). Developing an efficient BDD algorithm that can solve large problems has proved very difficult because BDDs are highly memory-consuming. Many attempts have been made to minimize BDD size within limited computational resources, such as static and dynamic variable ordering schemes; a further effort was the development of a ZBDD (zero-suppressed BDD) algorithm that calculates an approximate TEP. The present method is the first successful application of BDD truncation: it efficiently maintains a small BDD by truncating it during the BDD calculation. Benchmark tests demonstrate the efficiency of the developed method, and the TEP rapidly converges to the exact value as the truncation limit is lowered.
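
The truncation idea can be illustrated independently of the authors' implementation: drop any BDD region whose accumulated path probability falls below the truncation limit, accepting a small underestimate in exchange for bounded work. A minimal sketch over a tuple-encoded BDD for A AND (B OR C); note the paper truncates while building the BDD, which this post-hoc traversal does not capture:

```python
# A minimal sketch of probability truncation on a BDD (illustrative only).
# Nodes are (var, low, high) tuples; the integers 0 and 1 are the terminals.

def truncated_tep(node, p, path_prob=1.0, limit=1e-12):
    if path_prob < limit:
        return 0.0                    # prune: drop this low-probability region
    if node == 0 or node == 1:
        return path_prob * node       # mass of paths reaching the 1-terminal
    var, low, high = node
    return (truncated_tep(high, p, path_prob * p[var], limit)
            + truncated_tep(low, p, path_prob * (1.0 - p[var]), limit))

# Hypothetical tree: top event = A AND (B OR C).
bdd = ('A', 0, ('B', ('C', 0, 1), 1))
p = {'A': 0.1, 'B': 0.2, 'C': 0.3}
print(truncated_tep(bdd, p, limit=0.0))    # exact TEP: 0.044
print(truncated_tep(bdd, p, limit=0.021))  # 0.024: the A-and-B path (0.02) is cut
```

Lowering the limit restores the truncated mass, which is the convergence behavior the benchmark tests report.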

NEW RESULTS TO BDD TRUNCATION METHOD FOR EFFICIENT TOP EVENT PROBABILITY CALCULATION

  • Mo, Yuchang; Zhong, Farong; Zhao, Xiangfu; Yang, Quansheng; Cui, Gang
    • Nuclear Engineering and Technology, v.44 no.7, pp.755-766, 2012
  • A Binary Decision Diagram (BDD) is a graph-based data structure that yields an exact top event probability (TEP). Developing an efficient BDD algorithm that can solve large problems has proved very difficult because memory consumption is very high. Recently, to solve large reliability problems within limited computational resources, Jung presented an efficient method that maintains a small BDD by truncating it during the BDD calculation. This paper first identifies how Jung's BDD truncation algorithm can be improved for more practical use, and then proposes a more efficient truncation algorithm that generates truncated BDDs of smaller size and approximate TEPs with smaller truncation error. Empirical results show that the new algorithm uses slightly less running time and slightly more storage than Jung's algorithm. It was also found that designing a truncation algorithm with ideal features for every possible fault tree is very difficult, if not impossible; the ideal features meant here are that, as the truncation limit decreases, the size of the truncated BDD converges to the size of the exact BDD without ever exceeding it.

Feasibility Study on the Fault Tree Analysis Approach for the Management of the Faults in Running PCR Analysis (PCR 과정의 오류 관리를 위한 Fault Tree Analysis 적용에 관한 시범적 연구)

  • Lim, Ji-Su; Park, Ae-Ri; Lee, Seung-Ju; Hong, Kwang-Won
    • Applied Biological Chemistry, v.50 no.4, pp.245-252, 2007
  • FTA (fault tree analysis), an analytical method for managing system failures, was employed to manage faults in running PCR analysis. PCR is executed through several processes, of which the PCR machine operation process was selected for FTA; this simplest process in the PCR analysis was chosen as a first trial of the feasibility of the FTA approach. First, the fault events (top event, intermediate events, and basic events) were identified by surveying expert knowledge of PCR. These events were then correlated deductively to build a fault tree in a hierarchical structure. The fault tree was evaluated qualitatively and quantitatively, yielding minimal cut sets, structural importance, common cause vulnerability, a simulation of the probability of occurrence of the top event, cut set importance, item importance, and sensitivity. The top event was 'errors in the step of PCR machine operation in running PCR analysis'. The major intermediate events were 'failures in instrument' and 'errors in actions in experiment'. The basic events comprised four events based on human error and one event each based on instrument failure and energy source failure. These events were combined with Boolean logic gates (AND or OR) to construct the fault tree. In the qualitative evaluation, the basic events 'errors in preparing the reaction mixture', 'errors in setting temperature and time of PCR machine', 'failure of electrical power during running PCR machine', and 'errors in selecting adequate PCR machine' proved the most critical to the occurrence of the top event. In the quantitative evaluation, the list of critical events was not the same as that from the qualitative evaluation, because the failure probability of the PCR machine (not on the list above) increases with usage time, while the probabilities of electricity failure and of a defective PCR machine were set to zero owing to the general rarity of those events. It was concluded that this feasibility study is a worthwhile means of introducing the FTA technique to the management of faults in running PCR analysis.
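
For the qualitative step mentioned here, minimal cut sets are the irreducible combinations of basic events that cause the top event; they fall out of expanding OR gates as unions and AND gates as cross-products, then discarding supersets. A short sketch with hypothetical PCR-style event names, not the paper's actual tree:

```python
# Minimal cut set extraction from a small AND/OR gate tree (illustrative).
from itertools import product

def cut_sets(node):
    """Expand a gate tree into cut sets (frozensets of basic events)."""
    kind = node[0]
    if kind == 'basic':
        return [frozenset([node[1]])]
    children = [cut_sets(child) for child in node[1:]]
    if kind == 'or':   # any child's cut set causes the gate
        return [cs for child in children for cs in child]
    if kind == 'and':  # one cut set from every child, merged
        return [frozenset().union(*combo) for combo in product(*children)]

def minimize(sets):
    """Keep only minimal cut sets: drop any set containing a smaller one."""
    return [s for s in sets if not any(t < s for t in sets)]

# Hypothetical PCR-style tree, not the paper's fault tree.
top = ('or',
       ('and', ('basic', 'mix_error'), ('basic', 'temp_setting_error')),
       ('basic', 'power_failure'),
       ('and', ('basic', 'mix_error'), ('basic', 'power_failure')))
print(minimize(cut_sets(top)))
# The {'mix_error', 'power_failure'} branch is absorbed by {'power_failure'}.
```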

Evaluation of Uncertainty Importance Measure by Experimental Method in Fault Tree Analysis (결점나무 분석에서 실험적 방법을 이용한 불확실성 중요도 측도의 평가)

  • Cho, Jae-Gyeun
    • Journal of Korea Society of Industrial Information Systems, v.14 no.5, pp.187-195, 2009
  • In a fault tree analysis, an uncertainty importance measure is often used to assess how much of the uncertainty of the top event probability ($Q$) is attributable to the uncertainty of a basic event probability ($q_i$), and thus to identify those basic events whose uncertainties need to be reduced to effectively reduce the uncertainty of $Q$. To evaluate the measures suggested by many authors, which assess the percentage change in the variance $V$ of $Q$ with respect to a unit percentage change in the variance $v_i$ of $q_i$, $V$ and $\partial V/\partial v_i$ must be estimated analytically or by Monte Carlo simulation. However, computing $V$ and $\partial V/\partial v_i$ analytically is very complicated for large fault trees, and estimating them robustly by Monte Carlo simulation is difficult. In this paper, we propose a method for experimentally evaluating the measure using a Taguchi orthogonal array. The proposed method is computationally far more efficient than the Monte Carlo-based method, and provides a stable uncertainty importance for each basic event.
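
A minimal sketch of the orthogonal-array idea: evaluate $Q$ over a two-level orthogonal design on the basic event probabilities and read off each event's main effect as a cheap, stable importance indicator. The L4 array is standard, but the toy tree $Q = P((E_1 \wedge E_2) \vee E_3)$ and the low/high levels (taken at mean plus or minus one standard deviation) are assumptions, not the paper's design:

```python
# Orthogonal-array screening of uncertainty importance (illustrative sketch).
import statistics

L4 = [(0, 0, 0), (0, 1, 1), (1, 0, 1), (1, 1, 0)]  # 2-level orthogonal array

def Q(q1, q2, q3):
    # Toy top event = (E1 AND E2) OR E3, independent basic events.
    return q1 * q2 + q3 - q1 * q2 * q3

levels = [  # (low, high) per basic event probability, hypothetical values
    (0.08, 0.12),   # E1
    (0.15, 0.25),   # E2
    (0.01, 0.03),   # E3
]

runs = [Q(*(levels[i][row[i]] for i in range(3))) for row in L4]

# Each event's main effect: half the gap between its high- and low-level
# mean responses; squared, it acts as a variance contribution, i.e. a
# relative uncertainty importance.
for i, name in enumerate(['E1', 'E2', 'E3']):
    hi = statistics.mean(r for r, row in zip(runs, L4) if row[i] == 1)
    lo = statistics.mean(r for r, row in zip(runs, L4) if row[i] == 0)
    print(name, 'main effect:', round((hi - lo) / 2, 5))
```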

Risk Assessment and Application in Chemical Plants Using Fault Tree Analysis (FTA를 이용한 화학공장의 위험성 평가 및 응용)

  • Kim Yun-Hwa; Kim Ky-Soo; Yoon Sung-Ryul; Um Sung-In; Ko Jae-Wook
    • Journal of the Korean Institute of Gas, v.1 no.1, pp.81-86, 1997
  • This study estimates the possibility of accidents in chemical plants by analyzing the system components that affect the occurrence of the top event. Among the various risk assessment techniques, Fault Tree Analysis, which deductively traces the route of accident development, was used. Qualitative and quantitative risk assessment of plant hazards was performed using the gate-by-gate method and minimal cut sets. At the quantitative stage, the probability of occurrence and the frequency of the top event were calculated from failure or reliability data of the system components. In conclusion, the probability of an accident was estimated according to the logical structure of the fault tree, and the failure path that most strongly influences the occurrence of the top event was found through importance analysis.
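
The gate-by-gate method referred to here quantifies the tree bottom-up under an independence assumption: AND gates multiply child probabilities, OR gates combine their complements. A minimal sketch with a hypothetical plant-style tree and made-up failure probabilities:

```python
# Gate-by-gate fault tree quantification (illustrative sketch).
from math import prod

def gate_by_gate(node, p):
    kind = node[0]
    if kind == 'basic':
        return p[node[1]]
    child_ps = [gate_by_gate(child, p) for child in node[1:]]
    if kind == 'and':
        return prod(child_ps)
    if kind == 'or':
        return 1.0 - prod(1.0 - q for q in child_ps)

top = ('or',
       ('and', ('basic', 'valve_fail'), ('basic', 'alarm_fail')),
       ('basic', 'operator_error'))
p = {'valve_fail': 0.02, 'alarm_fail': 0.1, 'operator_error': 0.005}
print(gate_by_gate(top, p))  # 1 - (1 - 0.002)(1 - 0.005) ~= 0.00699
```

This is exact only when no basic event appears under more than one gate; with repeated events, minimal cut sets or a BDD (as in the entries above) are needed.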

A fuzzy reasonal analysis of human reliability represented as fault tree structure

  • 김정만; 이상도; 이동춘
    • Journal of the Ergonomics Society of Korea, v.16 no.2, pp.1-14, 1997
  • In conventional probability-based human reliability analysis, the basic human error rates are modified by experts to account for the many factors that affect human reliability. However, these influences are not easily represented quantitatively, because the relation between human reliability and each of these factors is not clear. In this paper, that relation is expressed quantitatively. Furthermore, human reliability is represented by the error possibilities proposed by Onisawa, each a fuzzy set on the interval [0,1], and fuzzy reasoning is used to obtain these error possibilities. The many basic events affected by the above factors are assumed to be connected to the top event through a fault tree structure, and an estimate of the top event, expressed as a membership function, is obtained using the fuzzy measure and fuzzy integral. Finally, a numerical example of human reliability analysis obtained by this method is given.
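
One common device for propagating fuzzy error possibilities through a fault tree, shown below for flavor, is alpha-cut interval arithmetic over triangular membership functions; this is a generic fuzzy-FTA construction and an assumption here, not necessarily the paper's fuzzy measure and fuzzy integral formulation:

```python
# Alpha-cut propagation of fuzzy error possibilities (illustrative sketch).
# Triangular fuzzy numbers (a, m, b) represent error possibilities on [0,1].

def alpha_cut(tfn, alpha):
    a, m, b = tfn
    return (a + alpha * (m - a), b - alpha * (b - m))

def and_gate(cut1, cut2):
    # Product of intervals; all endpoints lie in [0,1], so ends map to ends.
    return (cut1[0] * cut2[0], cut1[1] * cut2[1])

def or_gate(cut1, cut2):
    return (1 - (1 - cut1[0]) * (1 - cut2[0]),
            1 - (1 - cut1[1]) * (1 - cut2[1]))

# Top event = (E1 AND E2) OR E3, with hypothetical fuzzy possibilities.
e1, e2, e3 = (0.05, 0.1, 0.2), (0.2, 0.3, 0.4), (0.01, 0.02, 0.05)
for alpha in (0.0, 0.5, 1.0):
    cut = or_gate(and_gate(alpha_cut(e1, alpha), alpha_cut(e2, alpha)),
                  alpha_cut(e3, alpha))
    print(f"alpha={alpha}: top event possibility in "
          f"[{cut[0]:.4f}, {cut[1]:.4f}]")
```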

Theoretical approach for uncertainty quantification in probabilistic safety assessment using sum of lognormal random variables

  • Song, Gyun Seob; Kim, Man Cheol
    • Nuclear Engineering and Technology, v.54 no.6, pp.2084-2093, 2022
  • Probabilistic safety assessment is widely used to quantify the risks of nuclear power plants and their uncertainties. When lognormal distributions describe the uncertainties of basic events, the uncertainty of the top event in a fault tree is approximated by a sum of lognormal random variables once minimal cut sets are obtained and the rare-event approximation is applied. As handling the complicated analytic expressions for a sum of lognormal random variables is challenging, several approximation methods, especially Monte Carlo simulation, are widely used in practice for uncertainty analysis. In this study, a theoretical approach for analyzing the sum of lognormal random variables using an efficient numerical integration method is proposed for uncertainty analysis in probabilistic safety assessment. A change of variables, from correlated random variables over a complicated region of integration to independent random variables over a unit hypercube, yields an efficient numerical integration. The theoretical advantages of the proposed method over other approximation methods are shown through a benchmark problem. The proposed method provides an accurate and efficient way to calculate the uncertainty of the top event in probabilistic safety assessment when the uncertainties of basic events are described by lognormal random variables.
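
The quantity being analyzed is easy to reproduce: under the rare-event approximation the top event probability is a sum of minimal cut set contributions, each lognormal. The Monte Carlo baseline that the paper's numerical integration replaces looks like the sketch below; the cut set parameters are hypothetical, and the terms are sampled independently, whereas the paper also handles correlated variables:

```python
# Monte Carlo baseline for the uncertainty of a sum of lognormal terms
# (illustrative sketch, hypothetical parameters).
import math
import random

# (mu, sigma) of ln(term) for three hypothetical minimal cut sets.
cutset_params = [(math.log(1e-4), 0.5),
                 (math.log(3e-5), 0.8),
                 (math.log(1e-5), 1.2)]

rng = random.Random(42)
samples = sorted(
    sum(rng.lognormvariate(mu, sigma) for mu, sigma in cutset_params)
    for _ in range(100_000)
)
mean = sum(samples) / len(samples)
p95 = samples[int(0.95 * len(samples))]
print(f"mean TEP ~ {mean:.3e}, 95th percentile ~ {p95:.3e}")
```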

Evaluation of Uncertainty Importance Measure in Fault Tree Analysis (결점나무 분석에서 불확실성 중요도 측도의 평가)

  • Cho, Jae-Gyeun; Jeong, Seok-Chan
    • The Journal of Information Systems, v.17 no.3, pp.25-37, 2008
  • In a fault tree analysis, an uncertainty importance measure is often used to assess how much of the uncertainty of the top event probability ($Q$) is attributable to the uncertainty of a basic event probability ($q_i$), and thus to identify those basic events whose uncertainties need to be reduced to effectively reduce the uncertainty of $Q$. To evaluate the measures suggested by many authors, which assess the percentage change in the variance $V$ of $Q$ with respect to a unit percentage change in the variance $v_i$ of $q_i$, $V$ and $\partial V/\partial v_i$ must be estimated analytically or by Monte Carlo simulation. However, computing $V$ and $\partial V/\partial v_i$ analytically is very complicated for large fault trees, and estimating them robustly by Monte Carlo simulation is difficult. In this paper, we propose a method for evaluating the measure using a discretization technique and Monte Carlo simulation. The proposed method provides a stable uncertainty importance for each basic event.
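
A minimal sketch of the discretization idea: replace each basic event probability's distribution with a few support points and obtain the variance of $Q$ by enumerating the combinations. The equal-weight three-point scheme and the toy tree below are assumptions, not necessarily the authors' discretization:

```python
# Discretized uncertainty propagation through a toy fault tree (sketch).
from itertools import product
from statistics import mean, pvariance

def Q(q1, q2, q3):
    # Toy top event = (E1 AND E2) OR E3, independent basic events.
    return q1 * q2 + q3 - q1 * q2 * q3

# Three equal-weight support points approximating each q_i's distribution.
points = [
    [0.08, 0.10, 0.12],  # q_1
    [0.15, 0.20, 0.25],  # q_2
    [0.01, 0.02, 0.03],  # q_3
]

# Enumerate all 27 equally likely combinations to get the distribution of Q.
qs = [Q(*combo) for combo in product(*points)]
print(f"E[Q] ~ {mean(qs):.5f}, V = Var[Q] ~ {pvariance(qs):.3e}")
# Repeating this with a perturbed spread for one q_i and differencing the
# resulting V's gives a finite-difference estimate of dV/dv_i, the measure.
```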