• Title/Summary/Keyword: Empirical Probability

Empirical estimation of human error probabilities based on the complexity of proceduralized tasks in an analog environment

  • Park, Jinkyun;Kim, Hee Eun;Jang, Inseok
    • Nuclear Engineering and Technology
    • /
    • v.54 no.6
    • /
    • pp.2037-2047
    • /
    • 2022
  • The contribution of degraded human performance (e.g., human errors) is significant for the safety of diverse socio-technical systems. It is therefore crucial to understand when and why the performance of human operators degrades. In this study, the occurrence probability of human errors was empirically estimated based on the complexity of proceduralized tasks. To this end, a logistic regression analysis was conducted to correlate TACOM (task complexity) scores with human errors collected from a full-scope training simulator of nuclear power plants equipped with analog devices (an analog environment). As a result, it was observed that the occurrence probabilities of both errors of commission and errors of omission can be soundly estimated from TACOM scores. Since the effects of diverse performance-influencing factors on the occurrence probabilities of human errors could be distinguished by TACOM scores, TACOM scores are also expected to serve as a tool for explaining when and why the performance of human operators begins to degrade.
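
For concreteness, a minimal sketch of the estimation step described above: a logistic regression of error occurrence on a task complexity score. The TACOM values and error data below are synthetic placeholders, not the study's simulator records.

```python
# Minimal sketch: logistic regression linking task complexity scores to the
# probability of a human error. All data here are synthetic placeholders.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
tacom = rng.uniform(1.0, 7.0, size=200)            # hypothetical TACOM scores
p_true = 1 / (1 + np.exp(-(1.2 * tacom - 5.0)))    # assumed underlying relation
errors = rng.binomial(1, p_true)                   # 1 = error observed, 0 = none

model = LogisticRegression().fit(tacom.reshape(-1, 1), errors)

# Estimated human error probability for a task with TACOM score 4.5
p_hat = model.predict_proba([[4.5]])[0, 1]
print(f"Estimated error probability at TACOM = 4.5: {p_hat:.3f}")
```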

Examination of experimental errors in Scanlan derivatives of a closed-box bridge deck

  • Rizzo, Fabio;Caracoglia, Luca
    • Wind and Structures
    • /
    • v.26 no.4
    • /
    • pp.231-251
    • /
    • 2018
  • The objective of the investigation is the analysis of wind-tunnel experimental errors associated with the measurement of aeroelastic coefficients of bridge decks (Scanlan flutter derivatives). A two-degree-of-freedom experimental apparatus is used for the measurement of flutter derivatives. A section model of a closed-box bridge deck is considered in this investigation. Identification is based on free-vibration aeroelastic tests and the Iterative Least Squares method. The experimental error investigation is carried out by repeating the measurements and acquisitions thirty times for each wind tunnel speed and configuration of the model. This operational procedure is proposed for analyzing the experimental variability of flutter derivatives. Several statistical quantities are examined; these include the standard deviation and the empirical probability density function of the flutter derivatives at each wind speed. Moreover, the critical flutter speed of the setup is evaluated according to standard flutter theory, accounting for experimental variability. Since the probability distributions of the flutter derivatives and the critical flutter speed do not seem to obey a standard theoretical model, polynomial chaos expansion is proposed and used to represent the experimental variability.
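
The repeated-measurement analysis can be illustrated with a short sketch: thirty identifications of one flutter derivative at each wind speed, summarized by their standard deviation and an empirical density. All values are synthetic stand-ins for the wind-tunnel measurements.

```python
# Sketch of the variability analysis: per wind speed, 30 repeated estimates of a
# flutter derivative (here a made-up H1*-like quantity) are summarized by their
# sample standard deviation and a kernel-based empirical PDF.
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(1)
wind_speeds = [4.0, 6.0, 8.0]                       # m/s, placeholder values
for u in wind_speeds:
    h1 = rng.normal(loc=-0.5 * u, scale=0.05 * u, size=30)  # 30 repeated estimates
    kde = gaussian_kde(h1)                          # empirical PDF of the derivative
    grid = np.linspace(h1.min(), h1.max(), 50)
    print(f"U = {u}: mean = {h1.mean():.3f}, std = {h1.std(ddof=1):.3f}, "
          f"peak density ~ {kde(grid).max():.3f}")
```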

Interconnection Network for Routing Distributed Video Stream on Popularity-Independent Multimedia-on-Demand Server (PIMODS서버에서 분산 비디오스트림의 전송을 위한 상호연결망)

  • 임강빈;류문간;신준호;김상중;최경희;정기현
    • Journal of the Korean Institute of Telematics and Electronics C
    • /
    • v.36C no.11
    • /
    • pp.35-45
    • /
    • 1999
  • This paper presents an interconnection network for load balancing on a multimedia server and proposes a simple probabilistic model of the interconnection network for analyzing its traffic characteristics. Because the switch uses a deflection algorithm for routing, the traffic load on the switch strongly affects the deflection probability. In this paper, we trace the deflection probability as a function of the traffic load according to the model. By comparing the result with empirical results, we show that the model is useful for estimating the deflection probability and the traffic saturation point with respect to the amount of packets entering the switch.
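
A toy model of the deflection mechanism may help: in a 2x2 switch, a packet is deflected when both inputs carry packets contending for the same output, so the deflection probability grows with the offered load. This is an illustrative Monte Carlo, not the interconnection network analyzed in the paper.

```python
# Toy Monte Carlo of deflection routing in a 2x2 switch: deflection probability
# as a function of traffic load. Illustrative model only.
import random

def deflection_probability(load, trials=100_000):
    deflected = attempts = 0
    for _ in range(trials):
        # Each input port carries a packet with probability `load`.
        packets = [random.random() < load for _ in range(2)]
        prefs = [random.randrange(2) for _ in range(2)]
        attempts += sum(packets)
        # Both packets present and contending for the same output: one deflects.
        if all(packets) and prefs[0] == prefs[1]:
            deflected += 1
    return deflected / attempts if attempts else 0.0

for rho in (0.2, 0.5, 0.8):
    print(f"load {rho}: deflection probability ~ {deflection_probability(rho):.3f}")
```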

Reliability Analysis of Offshore Guyed Tower Against Anchor Pile Failures (해양 가이드-타워의 고정말뚝에 대한 신뢰도 해석)

  • 류정선;윤정방;강성후
    • Computational Structural Engineering
    • /
    • v.4 no.3
    • /
    • pp.117-127
    • /
    • 1991
  • For the reliability analysis of offshore guyed towers under large storm events, failure of an anchor pile of the guyline system is investigated. Two failure modes of the anchor pile, due to extreme and cyclic wave loadings, are considered. The probability of failure due to the extreme anchor load is evaluated based on a first excursion probability analysis. Degradation of the pile capacity due to cyclic loadings is evaluated using empirical fatigue curves for a driven pile in clay. The numerical results indicate that the failure probability due to cyclic loadings can be as large as the risk due to extreme loading, particularly for cases with a low design safety level of the pile strength and large uncertainty in the pile resistance.
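
The two failure modes can be sketched with a simple Monte Carlo: extreme-load exceedance of the pile capacity, and the same check with the capacity reduced by cyclic-loading degradation. The distributions and degradation factor below are assumed for illustration only.

```python
# Hedged sketch of the two failure modes: exceedance of the intact capacity by
# the extreme load, and exceedance of a cyclically degraded capacity.
import numpy as np

rng = np.random.default_rng(2)
n = 1_000_000
capacity = rng.lognormal(mean=np.log(10.0), sigma=0.2, size=n)   # pile capacity (MN)
extreme_load = rng.gumbel(loc=6.0, scale=1.0, size=n)            # storm peak load (MN)
degradation = 0.85  # assumed capacity retention after cyclic loading

p_extreme = np.mean(extreme_load > capacity)
p_cyclic = np.mean(extreme_load > degradation * capacity)
print(f"P(failure, extreme load):  {p_extreme:.4f}")
print(f"P(failure, degraded pile): {p_cyclic:.4f}")
```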

NEW RESULTS TO BDD TRUNCATION METHOD FOR EFFICIENT TOP EVENT PROBABILITY CALCULATION

  • Mo, Yuchang;Zhong, Farong;Zhao, Xiangfu;Yang, Quansheng;Cui, Gang
    • Nuclear Engineering and Technology
    • /
    • v.44 no.7
    • /
    • pp.755-766
    • /
    • 2012
  • A Binary Decision Diagram (BDD) is a graph-based data structure that supports calculation of an exact top event probability (TEP). It has been very difficult to develop an efficient BDD algorithm that can solve a large problem, since memory consumption is very high. Recently, in order to solve large reliability problems within limited computational resources, Jung presented an efficient method for maintaining a small BDD size by truncating the BDD during its calculation. In this paper, it is first shown that Jung's BDD truncation algorithm can be improved for more practical use. A more efficient truncation algorithm is then proposed, which generates a truncated BDD of smaller size and an approximate TEP with smaller truncation error. Empirical results showed that the new algorithm uses slightly less running time and slightly more storage than Jung's algorithm. It was also found that designing a truncation algorithm with ideal features for every possible fault tree is very difficult, if not impossible. The ideal features referred to here are that, as the truncation limit decreases, the size of the truncated BDD converges to the size of the exact BDD but never exceeds it.
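
The truncation idea can be illustrated without a full BDD package: a Shannon expansion over ordered basic events in which branches whose accumulated path probability falls below a limit are dropped, giving an approximate (lower-bound) top event probability. The fault tree below is a made-up example, not one from the paper.

```python
# Illustrative sketch of TEP calculation with probability truncation via Shannon
# expansion (standing in for BDD construction). Branches below `limit` are dropped.
def top_event(structure, probs, order, i=0, path_p=1.0, state=None, limit=1e-9):
    state = dict(state or {})
    if path_p < limit:          # truncation: ignore very unlikely branches
        return 0.0
    if i == len(order):
        return path_p if structure(state) else 0.0
    e, p = order[i], probs[order[i]]
    state[e] = True
    up = top_event(structure, probs, order, i + 1, path_p * p, state, limit)
    state[e] = False
    down = top_event(structure, probs, order, i + 1, path_p * (1 - p), state, limit)
    return up + down

# Made-up top event: (A AND B) OR C
structure = lambda s: (s["A"] and s["B"]) or s["C"]
probs = {"A": 0.01, "B": 0.02, "C": 0.001}
tep = top_event(structure, probs, list(probs), limit=1e-12)
print(f"Approximate top event probability: {tep:.6g}")
```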

Performance Analysis of Economic VaR Estimation using Risk Neutral Probability Distributions

  • Heo, Se-Jeong;Yeo, Sung-Chil;Kang, Tae-Hun
    • The Korean Journal of Applied Statistics
    • /
    • v.25 no.5
    • /
    • pp.757-773
    • /
    • 2012
  • Traditional value at risk (S-VaR) has difficulty predicting the future risk of financial asset prices, since S-VaR is a backward-looking measure based on historical data of the underlying asset prices. In order to resolve this deficiency, an economic value at risk (E-VaR) using risk-neutral probability distributions has been suggested, since E-VaR is a forward-looking measure based on option price data. In this study, E-VaR is estimated by assuming the generalized gamma distribution (GGD) as the risk-neutral density function implied in the options. The E-VaR estimated with the GGD was compared with E-VaR estimates under the Black-Scholes model, a two-lognormal mixture distribution, and the generalized extreme value distribution, and with S-VaR estimates under the normal distribution and a GARCH(1, 1) model. Option market data on the KOSPI 200 index are used to compare the performance of these VaR estimates. The results of the empirical analysis show that the GGD tends to estimate VaR conservatively; however, the GGD is superior to the other models in the overall sense.
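
Once a risk-neutral density has been fitted, the E-VaR computation reduces to a lower quantile of that distribution. A sketch using SciPy's generalized gamma distribution, with placeholder parameters rather than values estimated from KOSPI 200 option data:

```python
# Sketch: E-VaR as a lower quantile of an assumed risk-neutral generalized gamma
# distribution. Parameters and index level are hypothetical placeholders.
from scipy.stats import gengamma

a, c, scale = 4.0, 1.5, 60.0        # assumed GGD parameters
spot = 250.0                        # hypothetical current index level
alpha = 0.05                        # 95% confidence VaR

q = gengamma.ppf(alpha, a, c, scale=scale)   # alpha-quantile of terminal index level
e_var = spot - q                             # loss relative to the current level
print(f"5% quantile of risk-neutral index: {q:.2f}")
print(f"Economic VaR (95%): {e_var:.2f}")
```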

Identification of the associations between genes and quantitative traits using entropy-based kernel density estimation

  • Yee, Jaeyong;Park, Taesung;Park, Mira
    • Genomics & Informatics
    • /
    • v.20 no.2
    • /
    • pp.17.1-17.11
    • /
    • 2022
  • Genetic associations have been quantified using a number of statistical measures. Entropy-based mutual information may be one of the more direct ways of estimating the association, in the sense that it does not depend on the parametrization. For this purpose, both the entropy and conditional entropy of the phenotype distribution should be obtained. Quantitative traits, however, do not usually allow an exact evaluation of entropy. The estimation of entropy needs a probability density function, which can be approximated by kernel density estimation. We have investigated the proper sequence of procedures for combining the kernel density estimation and entropy estimation with a probability density function in order to calculate mutual information. Genotypes and their interactions were constructed to set the conditions for conditional entropy. Extensive simulation data created using three types of generating functions were analyzed using two different kernels as well as two types of multifactor dimensionality reduction and another probability density approximation method called m-spacing. The statistical power in terms of correct detection rates was compared. Using kernels was found to be most useful when the trait distributions were more complex than simple normal or gamma distributions. A full-scale genomic dataset was explored to identify associations using the 2-h oral glucose tolerance test results and γ-glutamyl transpeptidase levels as phenotypes. Clearly distinguishable single-nucleotide polymorphisms (SNPs) and interacting SNP pairs associated with these phenotypes were found and listed with empirical p-values.
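
A sketch of the core quantity, the KDE-based mutual information I(G; Y) = H(Y) - H(Y|G) between a genotype and a quantitative trait, with each differential entropy estimated by resubstitution of a Gaussian kernel density. The data are simulated placeholders.

```python
# Sketch: mutual information between a SNP genotype G and a quantitative trait Y
# via Gaussian-kernel density estimates of the trait distributions.
import numpy as np
from scipy.stats import gaussian_kde

def kde_entropy(y):
    """Resubstitution estimate of differential entropy via Gaussian KDE."""
    f = gaussian_kde(y)
    return -np.mean(np.log(f(y)))

rng = np.random.default_rng(3)
genotypes = rng.integers(0, 3, size=600)               # 0/1/2 coded SNP
trait = rng.normal(loc=0.4 * genotypes, scale=1.0)     # assumed genetic effect

h_y = kde_entropy(trait)
h_y_given_g = sum(
    np.mean(genotypes == g) * kde_entropy(trait[genotypes == g]) for g in range(3)
)
print(f"Estimated mutual information I(G; Y) = {h_y - h_y_given_g:.4f} nats")
```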

An Analysis on Argumentation in the Task Context of 'Monty Hall Problem' at a High School Probability Class (고등학교 확률 수업의 '몬티홀 문제' 과제 맥락에서 나타난 논증과정 분석)

  • Lee, Yoon-Kyung;Cho, Cheong-Soo
    • School Mathematics
    • /
    • v.17 no.3
    • /
    • pp.423-446
    • /
    • 2015
  • This study examines the characteristics of argumentation in the task context of the 'Monty Hall problem' in a high school probability class. Classroom discourse between the teacher and second-year students in one upper-level high school class was analyzed using Toulmin's argument pattern. It was found that creating a task context and a safe classroom culture in which students can ask questions and raise refutations is important for building an argument-centered discourse community. In addition, through argumentation while solving complex problems together, the students became more engaged in the class, and the concrete empirical context enriched their understanding of concepts. However, the reasoning in the argumentation was mostly mathematical, centered on probability problem-solving, rather than statistical. These results indicate that teachers should help students actively participate in argumentation through the task context and questioning, and that an understanding of statistical reasoning for interpreting the context is necessary to induce students' thinking and reasoning about probability and statistics.
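
The task context itself is easy to check empirically. A short simulation, in the spirit of the empirical context the abstract mentions, showing that switching wins with probability about 2/3:

```python
# Monty Hall simulation: compare the win rates of staying versus switching.
import random

def monty_hall(trials=100_000):
    switch_wins = stay_wins = 0
    for _ in range(trials):
        car = random.randrange(3)
        pick = random.randrange(3)
        # Host opens a goat door that is neither the pick nor the car.
        opened = next(d for d in range(3) if d != pick and d != car)
        switched = next(d for d in range(3) if d != pick and d != opened)
        stay_wins += (pick == car)
        switch_wins += (switched == car)
    return stay_wins / trials, switch_wins / trials

stay, switch = monty_hall()
print(f"stay: {stay:.3f}, switch: {switch:.3f}")   # ~0.333 vs ~0.667
```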

A Study on Properties of Crude Oil Based Derivative Linked Security (유가 연계 파생결합증권의 특성에 대한 연구)

  • Sohn, Kyoung-Woo;Chung, Ji-Yeong
    • Asia-Pacific Journal of Business
    • /
    • v.11 no.3
    • /
    • pp.243-260
    • /
    • 2020
  • Purpose - This paper investigates the properties of crude oil based derivative linked securities (DLS), focusing on the step-down type, for a comprehensive understanding of their risk. Design/methodology/approach - Kernel estimation is conducted to characterize the statistical features of the oil price process. We simulate oil price paths based on the kernel estimation results and derive the probabilities of hitting the barrier and of early redemption. Findings - The amount of issuance of crude oil based DLS is relatively low when base prices are below $40, while it is high when base prices are around $60 or $100. This is not consistent with the kernel estimation results, which show that oil futures prices tend to revert toward $46.14 and that the mean-reversion speed is faster when the oil price is lower. The analysis based on simulated oil price paths reveals that the probability of early redemption is below 50% for DLS with high base prices, and that the ratio of the probability of early redemption to the probability of hitting the barrier is remarkably low compared to the case of DLS with low base prices, as the chance of early redemption is deferred. Research implications or Originality - The empirical results imply that the level of the base price is a crucial risk factor for DLS; thus, introducing a time-varying knock-in barrier, which is similar to adjusting the base price, merits consideration to enhance protection for DLS investors.
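
The simulation step can be sketched as follows, with an assumed mean-reverting diffusion standing in for the kernel-estimated dynamics (the paper reports reversion toward $46.14); the knock-in and first early-redemption probabilities are then read off the simulated paths. All parameters and contract terms below are illustrative assumptions.

```python
# Sketch: simulate mean-reverting oil price paths and estimate the knock-in and
# first early-redemption probabilities of a step-down DLS. Parameters assumed.
import numpy as np

rng = np.random.default_rng(4)
n_paths, n_days = 20_000, 252 * 3
kappa, theta, sigma, dt = 2.0, 46.14, 12.0, 1 / 252   # assumed OU parameters

s = np.full(n_paths, 60.0)                 # hypothetical base price of $60
hit_barrier = np.zeros(n_paths, dtype=bool)
redeemed = np.zeros(n_paths, dtype=bool)
barrier, redemption_level = 0.5 * 60.0, 0.95 * 60.0   # assumed contract terms

for t in range(1, n_days + 1):
    s += kappa * (theta - s) * dt + sigma * np.sqrt(dt) * rng.standard_normal(n_paths)
    hit_barrier |= s <= barrier
    if t == 126:                           # first 6-month observation date
        redeemed |= s >= redemption_level

print(f"P(hit knock-in barrier)     ~ {hit_barrier.mean():.3f}")
print(f"P(early redemption at 6m)   ~ {redeemed.mean():.3f}")
```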

An Empirical Study on the Relationship between Market Feasibility Levels and Technology Variables from Technology Competitiveness Assessment (기술력평가에서 사업성수준과 기술성변수간 연관성에 관한 실증연구)

  • Sung Oong-Hyun
    • Journal of Korean Society for Quality Management
    • /
    • v.32 no.3
    • /
    • pp.198-215
    • /
    • 2004
  • Technology competitiveness assessment evaluates environmental and engineered technologies and processes at both the scientific and market levels. There is increasing interest in measuring the effects of technology variables on potential market feasibility levels; however, there have been very few empirical studies on this issue. This study investigates the impact of technology variables on the level of market feasibility based on 230 assessment records obtained from the Korea Technology Transfer Center. Several statistical methods, namely a canonical discriminant model, a logit discriminant model, and a classification model, were used and their results compared. The results showed that major technology variables were highly significant in discriminating between the high and low categories of market feasibility. Finally, this study should help in building management strategies to raise potential market performance and help financial institutions decide on the funds needed by small technology firms.
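
A sketch of the model comparison described above: logit discriminant and canonical (linear) discriminant classifiers of high versus low market feasibility. The variable names and data are invented stand-ins for the 230 assessment records.

```python
# Sketch: compare logit and canonical (LDA) discriminant models for classifying
# market feasibility from technology-rating variables. Data are synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(5)
X = rng.normal(size=(230, 3))              # e.g., novelty, maturity, rights scores
y = (X @ [0.9, 0.7, 0.4] + rng.normal(scale=1.0, size=230)) > 0  # high feasibility

for name, model in [("logit discriminant", LogisticRegression()),
                    ("canonical (LDA)", LinearDiscriminantAnalysis())]:
    acc = cross_val_score(model, X, y.astype(int), cv=5).mean()
    print(f"{name}: CV accuracy = {acc:.3f}")
```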