• Title/Summary/Keyword: Top Event


A COG Variable Analysis of Air-rolling-breakfall in Judo (유도 공중회전낙법의 COG변인 분석)

  • Kim, Eui-Hwan;Chung, Chae-Wook;Kim, Sung-Sup
    • Korean Journal of Applied Biomechanics
    • /
    • v.15 no.3
    • /
    • pp.117-132
    • /
    • 2005
  • This study is a follow-up to "A Kinematic Analysis of Air-rolling-breakfall in Judo". Its purpose was to analyze the center of gravity (COG) variables of the Air-rolling-breakfall motion performed while passing forward over (PFO) vertical hurdles (2 m height, 1 m take-off board) in judo. The subjects were four male members of the Y. University squad, trainees of the demonstration exhibition team and national-level judoists. They were filmed with four S-VHS 16 mm video cameras (60 fields/s) and analyzed using three-dimensional film analysis methods. The COG variables were the anterior-posterior and vertical COG positions and their linear velocities. The data were digitized with the KWON3D program, standardized by cubic spline interpolation, and the mean and standard deviation were calculated for each variable. From the data analysis and discussion, the conclusions for the Air-rolling-breakfall performed while PFO over the vertical hurdles (2 m height) were as follows: 1. The anterior-posterior directional COG (APD-COG) moved forward from 0.31~0.41 m at the take-off position (event 1) to 1.20~1.33 m at the air-top position (event 2), 2.12~2.30 m at the touch-down position (event 3), and 2.14~2.32 m at the safety breakfall position (event 4). 2. The linear velocity of the APD-COG was 1.03~2.14 m/s at the take-off position (event 1), 1.97~2.22 m/s at the air-top position (event 2), 1.05~1.32 m/s at the touch-down position (event 3), and 0.91~1.23 m/s at the safety breakfall position (event 4), a gradual decrease. 3. The vertical directional COG (VD-COG), measured upward from the mat, was 1.35~1.46 m at the take-off position (event 1), reached its highest value of 2.07~2.23 m at the air-top position (event 2), then decreased rapidly to 0.3~0.58 m at the touch-down position (event 3) and gradually to 0.22~0.50 m at the safety breakfall position (event 4). 4. The linear velocity of the VD-COG was 1.60~1.87 m/s at the take-off position (event 1), 0.03~0.08 m/s at the air-top position (event 2), -4.37~-4.76 m/s at the touch-down position (event 3), and -4.40~-4.77 m/s at the safety breakfall position (event 4). The Air-rolling-breakfall showed a parabolic movement from the take-off position to the air-top position and a vertical fall from the air-top position to the safety breakfall. In conclusion, ukemi (breakfall) is a method of falling safely. To decrease and minimize shock and impact during the Air-rolling-breakfall, angular momentum must be maximized from the take-off board action to the air-top position and then minimized at the touch-down and safety breakfall positions.
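
The standardization and velocity computation described in this abstract can be sketched as follows. This is a minimal illustration assuming digitized COG samples at 60 fields/s; the sample values, trial length, and 101-point time base are assumptions, not the paper's data or code.

```python
import numpy as np
from scipy.interpolate import CubicSpline

FIELD_RATE = 60.0  # fields per second, as stated for the video capture

def standardize_trial(cog, n_points=101):
    """Resample one digitized COG trajectory (metres) onto a common
    normalized time base with a cubic spline, so different trials can be
    averaged point by point (mean and SD per variable)."""
    t = np.linspace(0.0, 1.0, len(cog))
    t_std = np.linspace(0.0, 1.0, n_points)
    return CubicSpline(t, cog)(t_std)

def linear_velocity(cog, field_rate=FIELD_RATE):
    """Finite-difference linear velocity (m/s) of one COG component."""
    return np.gradient(cog, 1.0 / field_rate)

# Hypothetical vertical COG samples (metres) for one subject; not data from the paper.
vertical_cog = np.array([1.40, 1.75, 2.10, 2.15, 1.60, 0.90, 0.45, 0.30])
print(standardize_trial(vertical_cog)[:5])
print(linear_velocity(vertical_cog)[:5])
```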

Calculation of Top Event Probability of Fault Tree using BDD (BDD를 이용한 사고수목 정상사상확률 계산)

  • Cho, Byeong Ho;Yum, Byeoungsoo;Kim, Sangahm
    • Journal of the Korea Institute of Information and Communication Engineering
    • /
    • v.20 no.3
    • /
    • pp.654-662
    • /
    • 2016
  • As the number of gates and basic events in a fault tree increases, it becomes difficult to calculate the exact probability of the top event. To overcome this difficulty, the BDD methodology can be used to calculate the exact top event probability of small and medium-sized fault trees in a short time. Fault trees are converted to BDDs using CUDD library functions, and a failure path search algorithm is proposed to calculate the exact top event probability. The backward search algorithm is more efficient than the forward one in finding failure paths and in calculating the top event probability. This backward search algorithm reduces the time needed to identify disjoint failure paths in the BDD and can be considered an effective tool for finding the cut sets and the minimal cut sets of a given fault tree.
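
As a concrete illustration of how a BDD yields an exact top event probability, the minimal sketch below hand-encodes a toy fault tree (TOP = A AND (B OR C)) as a BDD and evaluates it by Shannon decomposition over its disjoint branches. It does not use the CUDD library or the authors' backward failure path search; the Node class, variable order, and probabilities are illustrative assumptions.

```python
# Minimal BDD sketch: each node tests one basic event; 'high' is the branch
# taken when the event occurs, 'low' when it does not. Terminals are True/False.
# This is an illustrative stand-in for a CUDD-built BDD, not the authors' code.

class Node:
    def __init__(self, var, high, low):
        self.var, self.high, self.low = var, high, low

def top_event_probability(node, p):
    """Exact TEP by Shannon decomposition; the two branches of a BDD node are
    disjoint, so their probabilities can simply be added."""
    if node is True:
        return 1.0
    if node is False:
        return 0.0
    q = p[node.var]
    return (q * top_event_probability(node.high, p)
            + (1.0 - q) * top_event_probability(node.low, p))

# Hypothetical fault tree TOP = A AND (B OR C), encoded by hand as a BDD
# with variable order A < B < C.
bdd = Node("A", Node("B", True, Node("C", True, False)), False)
p = {"A": 0.01, "B": 0.05, "C": 0.02}    # assumed basic event probabilities
print(top_event_probability(bdd, p))      # 0.01 * (1 - 0.95 * 0.98) = 0.00069
```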

A top-down iteration algorithm for Monte Carlo method for probability estimation of a fault tree with circular logic

  • Han, Sang Hoon
    • Nuclear Engineering and Technology
    • /
    • v.50 no.6
    • /
    • pp.854-859
    • /
    • 2018
  • Calculating minimal cut sets is a typical quantification method used to evaluate the top event probability of a fault tree. If minimal cut sets cannot be calculated, or if the accuracy of the quantification result is in doubt, the Monte Carlo method provides an alternative for fault tree quantification. The Monte Carlo method for fault tree quantification tends to take a long time because it repeats the calculation for a large number of samples. Herein, we propose an improved quantification algorithm for a fault tree with circular logic. We developed a top-down iteration algorithm that combines the characteristics of the top-down approach and the iteration approach, thereby reducing the computation time of the Monte Carlo method.
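
The idea of evaluating the tree top-down for each Monte Carlo sample, with circular logic resolved by iterating gate values to a fixed point, can be sketched as below. The gate structure, probabilities, and fixed-point loop are assumptions chosen for illustration, not the paper's algorithm or code.

```python
import random

GATES = {                           # gate: (type, inputs); gates may reference gates
    "TOP": ("OR",  ["G1", "E1"]),
    "G1":  ("AND", ["E2", "G2"]),
    "G2":  ("OR",  ["E3", "G1"]),   # circular reference back to G1
}
P = {"E1": 0.01, "E2": 0.1, "E3": 0.05}   # assumed basic event probabilities

def evaluate(sample):
    """Evaluate all gates for one sample of basic events, iterating until the
    gate values no longer change (gates in the loop start from False)."""
    val = {g: False for g in GATES}
    changed = True
    while changed:
        changed = False
        for g, (kind, inputs) in GATES.items():
            ins = [val[i] if i in GATES else sample[i] for i in inputs]
            new = any(ins) if kind == "OR" else all(ins)
            if new != val[g]:
                val[g], changed = new, True
    return val["TOP"]

def monte_carlo_tep(n=100_000, seed=1):
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        sample = {e: rng.random() < p for e, p in P.items()}
        hits += evaluate(sample)
    return hits / n

print(monte_carlo_tep())   # close to P(E1 OR (E2 AND E3)) for this toy tree
```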

Feasibility Study on the Fault Tree Analysis Approach for the Management of the Faults in Running PCR Analysis (PCR 과정의 오류 관리를 위한 Fault Tree Analysis 적용에 관한 시범적 연구)

  • Lim, Ji-Su;Park, Ae-Ri;Lee, Seung-Ju;Hong, Kwang-Won
    • Applied Biological Chemistry
    • /
    • v.50 no.4
    • /
    • pp.245-252
    • /
    • 2007
  • FTA (fault tree analysis), an analytical method for managing system failures, was employed in the management of faults in running PCR analysis. PCR is executed through several processes, of which the PCR machine operation process was selected for FTA. The simplest process in the PCR analysis was chosen as a first trial to test the feasibility of the FTA approach. First, the fault events (top event, intermediate events, and basic events) were identified from a survey of expert knowledge of PCR. These events were then related deductively to build a fault tree with a hierarchical structure. The fault tree was evaluated qualitatively and quantitatively, yielding minimal cut sets, structural importance, common cause vulnerability, a simulation of the probability of occurrence of the top event, cut set importance, item importance, and sensitivity. The top event was 'errors in the step of PCR machine operation in running PCR analysis'. The major intermediate events were 'failures in instrument' and 'errors in actions in experiment'. The basic events comprised four events based on human errors, one on instrument failure, and one on energy source failure. These events were combined with Boolean logic gates (AND or OR) to construct the fault tree. In the qualitative evaluation of the tree, the basic events 'errors in preparing the reaction mixture', 'errors in setting temperature and time of PCR machine', 'failure of electrical power during running PCR machine', and 'errors in selecting adequate PCR machine' proved the most critical to the occurrence of the top event. In the quantitative evaluation, the list of critical events differed from the qualitative result, because the probability of PCR machine failure, although not on the list above, increases with usage time, and the probabilities of the electricity failure and defective PCR machine events were set to zero owing to their general rarity. It was concluded that this feasibility study is a worthwhile means of introducing the novel technique, FTA, to the management of faults in running PCR analysis.
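
The qualitative step (minimal cut sets) and a simple quantitative step can be illustrated on a made-up miniature fault tree; the gate names, event names, and probabilities below are hypothetical and do not reproduce the paper's PCR fault tree or its importance measures.

```python
import math
from itertools import product

GATES = {
    "TOP":     ("OR",  ["G_HUMAN", "G_INSTR"]),
    "G_HUMAN": ("OR",  ["MIX_ERROR", "TEMP_SETTING_ERROR"]),
    "G_INSTR": ("AND", ["POWER_FAILURE", "NO_BACKUP_POWER"]),
}
P = {"MIX_ERROR": 1e-3, "TEMP_SETTING_ERROR": 5e-4,
     "POWER_FAILURE": 1e-4, "NO_BACKUP_POWER": 1e-2}

def cut_sets(name):
    """Expand a gate recursively into a list of cut sets (frozensets of basic events)."""
    if name not in GATES:
        return [frozenset([name])]
    kind, inputs = GATES[name]
    child = [cut_sets(i) for i in inputs]
    if kind == "OR":
        return [cs for sub in child for cs in sub]
    # AND gate: union of one cut set chosen from each input
    return [frozenset().union(*combo) for combo in product(*child)]

def minimal(candidates):
    """Keep only cut sets that contain no other cut set as a proper subset."""
    return [c for c in candidates if not any(o < c for o in candidates)]

mcs = minimal(cut_sets("TOP"))
tep = sum(math.prod(P[e] for e in c) for c in mcs)   # rare-event approximation
print(mcs, tep)
```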

FAST BDD TRUNCATION METHOD FOR EFFICIENT TOP EVENT PROBABILITY CALCULATION

  • Jung, Woo-Sik;Han, Sang-Hoon;Yang, Joon-Eon
    • Nuclear Engineering and Technology
    • /
    • v.40 no.7
    • /
    • pp.571-580
    • /
    • 2008
  • A Binary Decision Diagram (BDD) is a graph-based data structure that calculates an exact top event probability (TEP). Developing an efficient BDD algorithm that can solve a large problem has been a very difficult task since the approach is highly memory consuming. Many attempts, such as static and dynamic variable ordering schemes, have been made to minimize BDD size so that large reliability problems can be solved within limited computational resources. An additional effort was the development of a ZBDD (Zero-suppressed BDD) algorithm to calculate an approximate TEP. The present method is the first successful application of BDD truncation: it efficiently maintains a small BDD size by truncating the BDD during the BDD calculation. Benchmark tests demonstrate the efficiency of the developed method. The TEP rapidly converges to the exact value as the truncation limit is lowered.
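
The truncation idea can be sketched on a hand-built toy BDD: any partial path whose accumulated probability has already fallen below the truncation limit is abandoned, and the resulting TEP approaches the exact value as the limit is lowered. This is an illustrative stand-in under assumed structure and numbers, not the authors' BDD construction and truncation algorithm.

```python
class Node:
    def __init__(self, var, high, low):
        self.var, self.high, self.low = var, high, low

def truncated_tep(node, p, limit, path_prob=1.0):
    """Accumulate the TEP from the disjoint paths of a BDD, discarding any
    partial path whose probability is already below the truncation limit."""
    if path_prob < limit:       # truncate: drop this low-probability region
        return 0.0
    if node is True:
        return path_prob
    if node is False:
        return 0.0
    q = p[node.var]
    return (truncated_tep(node.high, p, limit, path_prob * q)
            + truncated_tep(node.low, p, limit, path_prob * (1.0 - q)))

# Toy model TOP = A AND (B OR C), variable order A < B < C (assumed values)
bdd = Node("A", Node("B", True, Node("C", True, False)), False)
p = {"A": 0.01, "B": 0.05, "C": 0.02}
for limit in (1e-2, 1e-4, 0.0):
    print(limit, truncated_tep(bdd, p, limit))   # converges to the exact TEP
```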

Evaluation of Uncertainty Importance Measure by Experimental Method in Fault Tree Analysis (결점나무 분석에서 실험적 방법을 이용한 불확실성 중요도 측도의 평가)

  • Cho, Jae-Gyeun
    • Journal of Korea Society of Industrial Information Systems
    • /
    • v.14 no.5
    • /
    • pp.187-195
    • /
    • 2009
  • In a fault tree analysis, an uncertainty importance measure is often used to assess how much of the uncertainty of the top event probability (Q) is attributable to the uncertainty of a basic event probability ($q_i$), and thus to identify those basic events whose uncertainties need to be reduced to effectively reduce the uncertainty of Q. Evaluating the measures suggested by many authors, which assess the percentage change in the variance V of Q with respect to a unit percentage change in the variance $\upsilon_i$ of $q_i$, requires V and $\partial V/\partial\upsilon_i$ to be estimated analytically or by Monte Carlo simulation. However, it is very complicated to compute V and $\partial V/\partial\upsilon_i$ analytically for large fault trees, and difficult to estimate them robustly by Monte Carlo simulation. In this paper, we propose a method for evaluating the measure experimentally using a Taguchi orthogonal array. The proposed method is computationally very efficient compared to the Monte Carlo approach and provides a stable uncertainty importance for each basic event.
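
A hedged sketch of the underlying measure: for a toy series model Q = 1 - (1 - q1)(1 - q2), the variance V of Q is estimated by Monte Carlo, and the importance of each basic event is approximated by shrinking the spread of that input and re-estimating V with common random numbers. The model, the multiplicative normal input uncertainties, and the finite-difference step are assumptions; the paper's Taguchi orthogonal array procedure is not reproduced here.

```python
import random

def variance_of_Q(scale1=1.0, scale2=1.0, n=100_000, seed=7):
    """Monte Carlo estimate of Var(Q) for Q = 1 - (1 - q1)(1 - q2)."""
    rng = random.Random(seed)                   # same seed -> common random numbers
    vals = []
    for _ in range(n):
        q1 = 1e-2 * (1.0 + scale1 * rng.gauss(0.0, 0.3))   # assumed uncertainty of q1
        q2 = 5e-3 * (1.0 + scale2 * rng.gauss(0.0, 0.2))   # assumed uncertainty of q2
        q1, q2 = max(q1, 0.0), max(q2, 0.0)
        vals.append(1.0 - (1.0 - q1) * (1.0 - q2))
    mean = sum(vals) / n
    return sum((v - mean) ** 2 for v in vals) / (n - 1)

V = variance_of_Q()
for i, scales in enumerate([(0.9, 1.0), (1.0, 0.9)], start=1):
    V_reduced = variance_of_Q(*scales)          # variance v_i shrunk by 1 - 0.9^2 = 19%
    importance = ((V - V_reduced) / V) / (1.0 - 0.9 ** 2)
    print(f"approximate uncertainty importance of q{i}: {importance:.2f}")
```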

Practical modeling and quantification of a single-top fire events probabilistic safety assessment model

  • Dae Il Kang;Yong Hun Jung
    • Nuclear Engineering and Technology
    • /
    • v.55 no.6
    • /
    • pp.2263-2275
    • /
    • 2023
  • In general, an internal fire events probabilistic safety assessment (PSA) model is quantified by modifying the pre-existing internal events PSA model. Because many pieces of equipment or cables can be damaged by a fire, a single fire event can lead to multiple internal events PSA initiating events (IEs). Consequently, when the fire events PSA model is quantified, inappropriate minimal cut sets (MCSs), such as duplicate MCSs, may be generated. This paper shows that a single quantification of a hypothetical single-top fire events PSA model may generate four types of inappropriate MCSs: duplicate MCSs, MCSs subsumed by other MCSs, nonsense MCSs, and MCSs with over-counted fire frequencies. Among these, the nonsense MCSs should be addressed first because they can interfere with the correct interpretation of the other MCSs and prevent the resolution of the issues related to the other inappropriate MCSs. In addition, we propose a resolution process for each of the issues caused by these inappropriate MCSs and suggest an overall procedure for resolving them. The results of this study will contribute to the understanding and resolution of the inappropriate MCSs that may appear in the quantification of fire events PSA models.
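
Two of the clean-up steps named above, removing duplicate MCSs and MCSs subsumed by other MCSs, can be sketched as follows. The event names are made up, and the fire-specific steps (nonsense MCSs, over-counted fire frequencies) are not reproduced because they require model knowledge.

```python
# Made-up raw cut sets from a quantification run; not from the paper's model.
raw_mcs = [
    {"FIRE_Z1", "PUMP_A_FAIL"},
    {"FIRE_Z1", "PUMP_A_FAIL"},                  # duplicate
    {"FIRE_Z1", "PUMP_A_FAIL", "VALVE_B_FAIL"},  # subsumed by the first set
    {"FIRE_Z1", "DG_START_FAIL"},
]

def clean(mcs_list):
    """Drop exact duplicates, then drop any set that strictly contains another."""
    unique = {frozenset(c) for c in mcs_list}
    return [set(c) for c in unique if not any(o < c for o in unique)]

for c in clean(raw_mcs):
    print(sorted(c))
```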

Compressor Fault Diagnosis Using the FTA Technique (FTA 기법을 이용한 Compressor 고장 진단)

  • 배용환
    • Proceedings of the Korean Society of Precision Engineering Conference
    • /
    • 1993.04b
    • /
    • pp.305-309
    • /
    • 1993
  • The application of the fault tree technique to the analysis of compressor failure is considered. The technique involves decomposing the system into a logic diagram, or fault tree, in which certain basic or primary events lead to a specified top event that signifies the total failure of the system. The fault tree is used to obtain minimal cut sets, from which the modes of system failure and, hence, the reliability for the top event can be calculated. The method of constructing fault trees and the subsequent estimation of the reliability of the system is illustrated through a compressor failure. FTA proved to be an efficient way to investigate the chain of compressor faults.
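
The final step described in this abstract, obtaining the top event probability from minimal cut sets, can be sketched with an inclusion-exclusion calculation over hypothetical cut sets; the failure modes and probabilities are assumptions, not values from the proceedings paper.

```python
import math
from itertools import combinations

# Hypothetical compressor failure modes and minimal cut sets.
P = {"valve_wear": 2e-3, "motor_winding": 1e-3, "relief_stuck": 5e-4, "oil_loss": 1e-3}
MCS = [{"valve_wear"}, {"motor_winding", "relief_stuck"}, {"oil_loss"}]

def top_event_prob(mcs):
    """Inclusion-exclusion over the cut sets, assuming independent basic events:
    the probability that at least one cut set fully occurs."""
    total = 0.0
    for k in range(1, len(mcs) + 1):
        for combo in combinations(mcs, k):
            union = set().union(*combo)
            total += (-1) ** (k + 1) * math.prod(P[e] for e in union)
    return total

print(top_event_prob(MCS))   # close to the rare-event sum of cut set probabilities
```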

Adapted Sequential Pattern Mining Algorithms for Business Service Identification (비즈니스 서비스 식별을 위한 변형 순차패턴 마이닝 알고리즘)

  • Lee, Jung-Won
    • Journal of the Korea Society of Computer and Information
    • /
    • v.14 no.4
    • /
    • pp.87-99
    • /
    • 2009
  • The top-down method for SOA delivery is recommended as the best way to take advantage of SOA. The core step of SOA delivery is service modeling, including service analysis and design based on ontology. Most enterprises know that the top-down approach is best, but they hesitate to employ it because it requires a great deal of time and money without showing immediate results, particularly when they already use well-defined component-based systems. In this paper, we propose a service identification method that, as a bottom-up approach, reuses well-defined components as much as possible. We assume that a user's inputs generate events on a GUI and that an approximate business process can be obtained by concatenating the event paths. We first find the core GUIs that have many outgoing event calls and form event paths by concatenating the event calls between the GUIs. Next, we adapt sequential pattern mining algorithms to find the maximal frequent event paths. In an experiment, we obtained business services of various granularity by applying a cohesion metric to the extracted frequent event paths.
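
A minimal sketch of the mining step: given event paths collected from GUI event calls, count contiguous sub-paths, keep those meeting a minimum support, and retain only the maximal ones. The example paths, the support threshold, and the restriction to contiguous sub-paths are simplifying assumptions, not the paper's adapted algorithm.

```python
from collections import Counter

# Hypothetical event paths concatenated from GUI event calls.
paths = [
    ("login", "search", "view_item", "add_cart", "checkout"),
    ("login", "search", "view_item", "logout"),
    ("search", "view_item", "add_cart", "checkout"),
]
MIN_SUPPORT = 2

counts = Counter()
for path in paths:
    subpaths = {path[i:j] for i in range(len(path))
                for j in range(i + 1, len(path) + 1)}
    counts.update(subpaths)        # count each sub-path once per source path

frequent = [s for s, c in counts.items() if c >= MIN_SUPPORT]

def contained(short, long):
    """True if 'short' occurs as a contiguous run inside 'long'."""
    return any(long[i:i + len(short)] == short
               for i in range(len(long) - len(short) + 1))

maximal = [s for s in frequent
           if not any(s != t and contained(s, t) for t in frequent)]
print(sorted(maximal, key=len, reverse=True))   # candidate business service paths
```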

NEW RESULTS TO BDD TRUNCATION METHOD FOR EFFICIENT TOP EVENT PROBABILITY CALCULATION

  • Mo, Yuchang;Zhong, Farong;Zhao, Xiangfu;Yang, Quansheng;Cui, Gang
    • Nuclear Engineering and Technology
    • /
    • v.44 no.7
    • /
    • pp.755-766
    • /
    • 2012
  • A Binary Decision Diagram (BDD) is a graph-based data structure that calculates an exact top event probability (TEP). Developing an efficient BDD algorithm that can solve a large problem has been a very difficult task because the memory consumption is very high. Recently, in order to solve a large reliability problem within limited computational resources, Jung presented an efficient method that maintains a small BDD size by truncating the BDD during the BDD calculation. In this paper, it is first shown that Jung's BDD truncation algorithm can be improved for more practical use. A more efficient truncation algorithm is then proposed, which generates a truncated BDD of smaller size and an approximate TEP with smaller truncation error. Empirical results show that the new algorithm uses slightly less running time and slightly more storage than Jung's algorithm. It was also found that designing a truncation algorithm with ideal features for every possible fault tree is very difficult, if not impossible. The ideal features referred to here are that, as the truncation limit decreases, the size of the truncated BDD converges to the size of the exact BDD but never exceeds it.