• Title/Abstract/Keywords: Empirical Probability

Empirical estimation of human error probabilities based on the complexity of proceduralized tasks in an analog environment

  • Park, Jinkyun;Kim, Hee Eun;Jang, Inseok
    • Nuclear Engineering and Technology
    • /
    • Vol. 54, No. 6
    • /
    • pp.2037-2047
    • /
    • 2022
  • The contribution of degraded human performance (e.g., human errors) to the safety of diverse socio-technical systems is significant. It is therefore crucial to understand when and why the performance of human operators degrades. In this study, the occurrence probability of human errors was empirically estimated from the complexity of proceduralized tasks. To this end, logistic regression analysis was conducted to correlate TACOM (task complexity) scores with human errors collected from a full-scope training simulator of nuclear power plants equipped with analog devices (an analog environment). The results show that the occurrence probabilities of both errors of commission and errors of omission can be soundly estimated from TACOM scores. Since the effects of diverse performance-influencing factors on the occurrence probabilities of human errors can be distinguished by TACOM scores, it is also expected that TACOM scores can be used as a tool to explain when and why the performance of human operators begins to degrade.
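
The fitting step described in this abstract can be illustrated with a minimal logistic-regression sketch; the complexity scores and error outcomes below are simulated stand-ins, not the paper's TACOM data.

```python
# Minimal sketch: logistic regression of human-error occurrence on a task
# complexity score (illustrative data; not the paper's simulator dataset).
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
tacom = rng.uniform(1.0, 7.0, size=500)               # hypothetical complexity scores
p_true = 1.0 / (1.0 + np.exp(-(1.2 * tacom - 5.0)))   # assumed "true" error tendency
error = rng.binomial(1, p_true)                       # 1 = error observed, 0 = no error

model = LogisticRegression().fit(tacom.reshape(-1, 1), error)

# Estimated error probability at a few complexity levels
for score in (2.0, 4.0, 6.0):
    p = model.predict_proba([[score]])[0, 1]
    print(f"complexity score {score:.1f}: estimated error probability {p:.3f}")
```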

Examination of experimental errors in Scanlan derivatives of a closed-box bridge deck

  • Rizzo, Fabio;Caracoglia, Luca
    • Wind and Structures
    • /
    • Vol. 26, No. 4
    • /
    • pp.231-251
    • /
    • 2018
  • The objective of this investigation is the analysis of wind-tunnel experimental errors associated with the measurement of the aeroelastic coefficients of bridge decks (Scanlan flutter derivatives). A two-degree-of-freedom experimental apparatus is used to measure the flutter derivatives of a section model of a closed-box bridge deck. Identification is based on free-vibration aeroelastic tests and the iterative least-squares method. The experimental error investigation is carried out by repeating the measurements and acquisitions thirty times for each wind-tunnel speed and configuration of the model; this operational procedure is proposed for analyzing the experimental variability of the flutter derivatives. Several statistical quantities are examined, including the standard deviation and the empirical probability density function of the flutter derivatives at each wind speed. Moreover, the critical flutter speed of the setup is evaluated according to standard flutter theory while accounting for the experimental variability. Since the probability distributions of the flutter derivatives and the critical flutter speed do not appear to follow a standard theoretical model, a polynomial chaos expansion is proposed and used to represent the experimental variability.
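
The per-wind-speed statistics mentioned above (standard deviation and an empirical probability density over repeated runs) can be sketched as follows; the thirty values are synthetic stand-ins for measured flutter derivatives.

```python
# Sketch: empirical statistics of one flutter derivative from repeated runs
# (synthetic values stand in for thirty wind-tunnel measurements at one speed).
import numpy as np

rng = np.random.default_rng(1)
h1_star = rng.normal(loc=-2.5, scale=0.15, size=30)   # 30 repeats at one wind speed

mean = h1_star.mean()
std = h1_star.std(ddof=1)

# Empirical probability density via a normalized histogram
density, edges = np.histogram(h1_star, bins=8, density=True)
centers = 0.5 * (edges[:-1] + edges[1:])

print(f"mean = {mean:.3f}, sample std = {std:.3f}")
for c, d in zip(centers, density):
    print(f"H1* ≈ {c:.2f}: empirical density {d:.2f}")
```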

Interconnection Network for Routing Distributed Video Streams on a Popularity-Independent Multimedia-on-Demand (PIMODS) Server

  • 임강빈;류문간;신준호;김상중;최경희;정기현
    • 전자공학회논문지C
    • /
    • Vol. 36C, No. 11
    • /
    • pp.35-45
    • /
    • 1999
  • This paper presents a switch that resolves the load-imbalance problem of multimedia servers and proposes a simple probabilistic model for analyzing the traffic characteristics of that switch. Because the switch uses detour routing as its path-setup scheme, the traffic load inside the switch strongly affects the detour probability. Based on the proposed model, the detour probability is traced as a function of the traffic load of the switch, and the validity of the probabilistic model is verified by comparing its results with experimental results. With the probabilistic model, the detours that occur as a function of the volume of packets entering the switch, and the corresponding traffic saturation point of the switch, can be predicted.

Reliability Analysis of Offshore Guyed Tower Against Anchor Pile Failures

  • 류정선;윤정방;강성후
    • 전산구조공학
    • /
    • Vol. 4, No. 3
    • /
    • pp.117-127
    • /
    • 1991
  • A reliability analysis method for an offshore guyed tower under storm conditions is studied, focusing on the failure of the anchor piles of the mooring system. Two pile failure modes are considered: failure under the peak load and failure under cyclic loading. The failure probability due to the peak load is evaluated with the first-passage probability method, whereas the failure probability due to cyclic loading is obtained from fatigue curves for piles driven in clay. The dynamic analysis of the structure under random waves is carried out efficiently in the frequency domain by linearizing the nonlinear problem. Numerical results show that as the mean safety margin of the pile capacity decreases and its coefficient of variation increases, the failure probability due to cyclic loading grows to the same level as that due to the peak load.
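
As a rough illustration of a peak-load failure check of this kind (not the paper's first-passage or fatigue formulations), a toy Monte Carlo estimate of P(capacity < load) with an assumed lognormal capacity and Gumbel extreme load might look like this:

```python
# Toy sketch: peak-load failure probability of an anchor pile, P(capacity < load),
# via Monte Carlo. Distributions and parameters are illustrative assumptions only.
import numpy as np

rng = np.random.default_rng(2)
n = 1_000_000

mean_capacity, cov_capacity = 50.0, 0.25      # assumed mean capacity (MN) and coefficient of variation
sigma_ln = np.sqrt(np.log(1 + cov_capacity**2))
mu_ln = np.log(mean_capacity) - 0.5 * sigma_ln**2
capacity = rng.lognormal(mu_ln, sigma_ln, n)  # lognormal pile capacity

peak_load = rng.gumbel(loc=25.0, scale=4.0, size=n)  # assumed extreme storm load (MN)

p_fail = np.mean(capacity < peak_load)
print(f"Estimated peak-load failure probability: {p_fail:.2e}")
```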

NEW RESULTS TO BDD TRUNCATION METHOD FOR EFFICIENT TOP EVENT PROBABILITY CALCULATION

  • Mo, Yuchang;Zhong, Farong;Zhao, Xiangfu;Yang, Quansheng;Cui, Gang
    • Nuclear Engineering and Technology
    • /
    • Vol. 44, No. 7
    • /
    • pp.755-766
    • /
    • 2012
  • A binary decision diagram (BDD) is a graph-based data structure that allows an exact top event probability (TEP) to be calculated. Developing an efficient BDD algorithm that can solve large problems has been very difficult because memory consumption is very high. Recently, in order to solve large reliability problems within limited computational resources, Jung presented an efficient method that keeps the BDD small by truncating it during the BDD calculation. In this paper, it is first shown that Jung's BDD truncation algorithm can be improved for more practical use. A more efficient truncation algorithm is then proposed, which generates truncated BDDs of smaller size and approximate TEPs with smaller truncation error. Empirical results show that the new algorithm needs slightly less running time and slightly more storage than Jung's algorithm. It was also found that designing a truncation algorithm with ideal features for every possible fault tree is very difficult, if not impossible. The ideal features referred to here are that, as the truncation limit decreases, the size of the truncated BDD converges to the size of the exact BDD and never exceeds it.
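
The truncation idea itself, dropping low-probability contributions to keep the computation small, can be illustrated with a much simpler cut-set approximation. This sketch is not the BDD algorithm discussed in the paper; the toy fault tree and probabilities are made up.

```python
# Illustration of the truncation idea on a toy fault tree: the rare-event
# approximation over minimal cut sets, dropping any cut set whose probability
# falls below a truncation limit. This is NOT the paper's BDD algorithm.
from math import prod

basic_event_prob = {"A": 1e-2, "B": 5e-3, "C": 1e-3, "D": 1e-4}
minimal_cut_sets = [{"A", "B"}, {"A", "C"}, {"B", "C", "D"}]

def top_event_prob(cut_sets, probs, truncation_limit=0.0):
    """Rare-event (sum of cut-set probabilities) approximation of the TEP,
    ignoring cut sets whose probability is below the truncation limit."""
    total = 0.0
    for cs in cut_sets:
        p = prod(probs[e] for e in cs)
        if p >= truncation_limit:
            total += p
    return total

untruncated = top_event_prob(minimal_cut_sets, basic_event_prob)
truncated = top_event_prob(minimal_cut_sets, basic_event_prob, truncation_limit=1e-7)
print(f"untruncated ≈ {untruncated:.3e}, truncated ≈ {truncated:.3e}")
```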

Performance Analysis of Economic VaR Estimation using Risk Neutral Probability Distributions

  • Heo, Se-Jeong;Yeo, Sung-Chil;Kang, Tae-Hun
    • 응용통계연구
    • /
    • Vol. 25, No. 5
    • /
    • pp.757-773
    • /
    • 2012
  • Traditional value at risk (S-VaR) has difficulty predicting the future risk of financial asset prices, since S-VaR is a backward-looking measure based on the historical data of the underlying asset prices. In order to resolve this deficiency of S-VaR, an economic value at risk (E-VaR) using risk-neutral probability distributions has been suggested, since E-VaR is a forward-looking measure based on option price data. In this study, E-VaR is estimated by assuming the generalized gamma distribution (GGD) as the risk-neutral density function implied in the options. The E-VaR estimated with the GGD was compared with E-VaR estimates under the Black-Scholes model, a two-lognormal mixture distribution, and the generalized extreme value distribution, as well as with S-VaR estimates under the normal distribution and a GARCH(1, 1) model, respectively. Option market data on the KOSPI 200 index are used to compare the performance of the above VaR estimates. The results of the empirical analysis show that the GGD tends to estimate VaR conservatively; however, the GGD is superior to the other models in the overall sense.
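
Reading a VaR off an assumed risk-neutral density reduces to a quantile computation; a minimal sketch with SciPy's generalized gamma distribution is shown below, using illustrative parameters rather than values calibrated to KOSPI 200 option prices.

```python
# Sketch: 5% VaR as a quantile of an assumed generalized gamma distribution
# for the index level (parameters are illustrative, not calibrated to options).
from scipy.stats import gengamma

s0 = 250.0                        # hypothetical current index level
a, c, scale = 25.0, 2.0, 50.0     # assumed generalized gamma shape/scale parameters

# 5th percentile of the assumed risk-neutral index distribution at the horizon
q05 = gengamma.ppf(0.05, a, c, scale=scale)

# VaR as the loss (relative to today's level) not exceeded with 95% probability
var_95 = s0 - q05
print(f"5% quantile of index: {q05:.1f}, 95% VaR: {var_95:.1f} index points")
```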

Identification of the associations between genes and quantitative traits using entropy-based kernel density estimation

  • Yee, Jaeyong;Park, Taesung;Park, Mira
    • Genomics & Informatics
    • /
    • Vol. 20, No. 2
    • /
    • pp.17.1-17.11
    • /
    • 2022
  • Genetic associations have been quantified using a number of statistical measures. Entropy-based mutual information may be one of the more direct ways of estimating an association, in the sense that it does not depend on a parametrization. For this purpose, both the entropy and the conditional entropy of the phenotype distribution must be obtained. Quantitative traits, however, do not usually allow an exact evaluation of entropy; entropy estimation requires a probability density function, which can be approximated by kernel density estimation. We investigated the proper sequence of procedures for combining kernel density estimation and entropy estimation in order to calculate mutual information. Genotypes and their interactions were constructed to set the conditions for the conditional entropy. Extensive simulation data created using three types of generating functions were analyzed using two different kernels, two types of multifactor dimensionality reduction, and another probability density approximation method called m-spacing. The statistical power in terms of correct detection rates was compared. Using kernels was found to be most useful when the trait distributions were more complex than simple normal or gamma distributions. A full-scale genomic dataset was explored to identify associations using the 2-h oral glucose tolerance test results and γ-glutamyl transpeptidase levels as phenotypes. Clearly distinguishable single-nucleotide polymorphisms (SNPs) and interacting SNP pairs associated with these phenotypes were found and listed with empirical p-values.
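
The H(Y) − H(Y|G) pipeline described above can be illustrated with a small plug-in estimator that uses a Gaussian KDE for the trait density; the genotype/trait data below are simulated, and the single fixed integration grid is a simplification of the procedures compared in the paper.

```python
# Sketch: mutual information between a genotype (0/1/2) and a quantitative trait,
# estimated as H(Y) - H(Y|G) with Gaussian kernel density estimates (simulated data).
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(3)
genotype = rng.integers(0, 3, size=600)              # SNP coded 0/1/2
trait = rng.normal(loc=0.4 * genotype, scale=1.0)    # trait shifted by genotype

grid = np.linspace(trait.min() - 1, trait.max() + 1, 2000)
dx = grid[1] - grid[0]

def differential_entropy(samples):
    """Plug-in entropy estimate: -∫ f log f dx, with f from a Gaussian KDE."""
    f = gaussian_kde(samples)(grid)
    f = np.clip(f, 1e-300, None)
    return -np.sum(f * np.log(f)) * dx

h_y = differential_entropy(trait)                    # H(Y)
h_y_given_g = sum(                                   # H(Y|G), weighted by genotype frequency
    np.mean(genotype == g) * differential_entropy(trait[genotype == g])
    for g in (0, 1, 2)
)
print(f"H(Y) = {h_y:.3f}, H(Y|G) = {h_y_given_g:.3f}, MI ≈ {h_y - h_y_given_g:.3f}")
```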

An Analysis of Argumentation in the Task Context of the 'Monty Hall Problem' in a High School Probability Class

  • 이윤경;조정수
    • 대한수학교육학회지:학교수학
    • /
    • Vol. 17, No. 3
    • /
    • pp.423-446
    • /
    • 2015
  • The purpose of this study is to examine the characteristics of the argumentation process that emerged in the task context of the 'Monty Hall problem' in a high school probability class. Classroom discourse on the argumentation between the teacher and students in one high-achieving second-year high school class was analyzed using Toulmin's argumentation pattern. The results show the importance of a task context that builds an argumentation-centered discourse community and of a safe classroom culture in which students can question and rebut. In addition, through the argumentation process of solving a complex problem together, students became more engaged in the lesson, and the authentic empirical context enriched their understanding of the concepts. However, the reasoning that appeared in the argumentation process was mostly mathematical reasoning centered on solving probability problems rather than statistical reasoning. These findings suggest that teachers need to understand that students' statistical reasoning arises while interpreting results in context, and that probability and statistics instruction should use task contexts and questioning to engage students actively in argumentation.
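
The empirical flavor of the task is easy to reproduce with a quick simulation (not part of the study) that compares the stay and switch strategies:

```python
# Quick Monty Hall simulation: empirical win probabilities for "stay" vs "switch".
import random

def play(switch: bool) -> bool:
    doors = [0, 1, 2]
    car = random.choice(doors)
    pick = random.choice(doors)
    # The host opens a door that is neither the contestant's pick nor the car
    opened = random.choice([d for d in doors if d != pick and d != car])
    if switch:
        pick = next(d for d in doors if d != pick and d != opened)
    return pick == car

n = 100_000
stay_wins = sum(play(switch=False) for _ in range(n)) / n
switch_wins = sum(play(switch=True) for _ in range(n)) / n
print(f"stay ≈ {stay_wins:.3f}, switch ≈ {switch_wins:.3f}")   # ≈ 1/3 vs 2/3
```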

A Study on Properties of Crude Oil Based Derivative Linked Security

  • 손경우;정지영
    • 아태비즈니스연구
    • /
    • Vol. 11, No. 3
    • /
    • pp.243-260
    • /
    • 2020
  • Purpose - This paper investigates the properties of crude oil based derivative linked securities (DLS), focusing on the step-down type, for a comprehensive understanding of their risk. Design/methodology/approach - Kernel estimation is conducted to characterize the statistical features of the oil price process. Oil price paths are simulated based on the kernel estimation results, and the probabilities of hitting the barrier and of early redemption are derived. Findings - The amount of issuance of crude oil based DLS is relatively low when base prices are below $40, while it is high when base prices are around $60 or $100. This is not consistent with the kernel estimation results, which show that oil futures prices tend to revert toward $46.14 and that the mean-reversion speed is faster when the oil price is lower. The analysis based on simulated oil price paths reveals that the probability of early redemption is below 50% for DLS with high base prices, and that the ratio of the probability of early redemption to the probability of hitting the barrier is remarkably low compared with DLS with low base prices, as the chance of early redemption is deferred. Research implications or Originality - The empirical results imply that the level of the base price is a crucial risk factor for DLS; thus, introducing a time-varying knock-in barrier, which is similar to adjusting the base price, merits consideration to enhance protection for DLS investors.
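
A rough sketch of the path-simulation step follows: mean-reverting paths are used to estimate the probability of hitting a knock-in barrier versus meeting a first early-redemption condition. Only the long-run level of $46.14 is taken from the abstract; the reversion speed, volatility, barrier, and redemption terms are placeholder assumptions, not the paper's kernel-estimated dynamics.

```python
# Rough sketch: simulate mean-reverting oil price paths and estimate the
# probability of hitting a knock-in barrier vs. satisfying the first
# early-redemption condition. Parameters other than the long-run level are assumed.
import numpy as np

rng = np.random.default_rng(4)
n_paths, n_days = 20_000, 756            # ~3 years of daily steps
dt = 1 / 252

s0 = 60.0                                # base price (USD/bbl), illustrative
long_run, speed, vol = 46.14, 1.0, 0.35  # long-run level from the abstract; speed/vol assumed

barrier = 0.5 * s0                       # assumed knock-in at 50% of the base price
redeem_level = 0.9 * s0                  # assumed first early-redemption trigger
redeem_step = 126                        # first observation after ~6 months

prices = np.full(n_paths, s0)
hit_barrier = np.zeros(n_paths, dtype=bool)
redeemed = np.zeros(n_paths, dtype=bool)

for t in range(1, n_days + 1):
    shocks = rng.standard_normal(n_paths)
    prices = prices + speed * (long_run - prices) * dt + vol * prices * np.sqrt(dt) * shocks
    hit_barrier |= prices <= barrier
    if t == redeem_step:
        redeemed = prices >= redeem_level   # redeemed at the first observation date

print(f"P(hit barrier)      ≈ {hit_barrier.mean():.3f}")
print(f"P(early redemption) ≈ {redeemed.mean():.3f}")
```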

An Empirical Study on the Relationship between Market Feasibility Levels and Technology Variables from Technology Competitiveness Assessment

  • 성웅현
    • 품질경영학회지
    • /
    • Vol. 32, No. 3
    • /
    • pp.198-215
    • /
    • 2004
  • Technology competitiveness assessment evaluates environmental and engineered technology and processes at both the scientific and the market level. There is increasing interest in measuring the effects of technology variables on the potential market feasibility level, but very few empirical studies have addressed this issue. This study investigates the impact of technology variables on the level of market feasibility based on 230 cases obtained from the Korea Technology Transfer Center. For the statistical analysis, canonical discriminant, logit discriminant, and classification models were used and their results were compared. The results show that the major technology variables are highly significant in discriminating between the high and low categories of market feasibility. Finally, this study should help in building management strategies to raise potential market performance and help financial institutions decide on funding for small technology firms.
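
A minimal sketch of comparing a linear discriminant with a logit discriminant on simulated stand-ins for the technology variables is shown below; the variables, effect sizes, and data are hypothetical, and only the sample size of 230 echoes the study.

```python
# Sketch: linear discriminant vs. logit discriminant for high/low market
# feasibility, on simulated stand-ins for the technology assessment variables.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(5)
n = 230                                    # same sample size as the study; data simulated
X = rng.normal(size=(n, 4))                # four hypothetical technology variables
logit_p = 1 / (1 + np.exp(-(X @ np.array([1.0, 0.8, 0.5, 0.0]) - 0.2)))
y = rng.binomial(1, logit_p)               # 1 = high market feasibility, 0 = low

for name, model in [("LDA", LinearDiscriminantAnalysis()),
                    ("logit", LogisticRegression(max_iter=1000))]:
    acc = cross_val_score(model, X, y, cv=5).mean()
    print(f"{name}: mean cross-validated classification accuracy ≈ {acc:.3f}")
```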