• Title/Summary/Keyword: Rare events

Rare Disaster Events, Growth Volatility, and Financial Liberalization: International Evidence

  • Bongseok Choi
    • Journal of Korea Trade / v.27 no.2 / pp.96-114 / 2023
  • Purpose - This paper elucidates a nexus between the occurrence of rare disaster events and the volatility of economic growth by distinguishing the likelihood of rare events from stochastic volatility. We provide new empirical facts based on a quarterly time series. In particular, we focus on the role of financial liberalization in spreading the economic crisis in developing countries. Design/methodology - We use quarterly data on consumption expenditure (real per capita consumption) from 44 countries, including advanced and developing countries, ending in the fourth quarter of 2020. We estimate the likelihood of rare event occurrences and stochastic volatility for countries using the Bayesian Markov chain Monte Carlo (MCMC) method developed by Barro and Jin (2021). We present our estimation results for the relationship between rare disaster events, stochastic volatility, and growth volatility. Findings - We find the global common disaster event, the COVID-19 pandemic, and thirteen country-specific disaster events. Consumption falls by about 7% on average in the first quarter of a disaster and by 4% in the long run. The occurrence of rare disaster events and the volatility of gross domestic product (GDP) growth are positively correlated (4.8%), whereas the rare events and GDP growth rate are negatively correlated (-12.1%). In particular, financial liberalization has played an important role in exacerbating the adverse impact of both rare disasters and financial market instability on growth volatility. Several case studies, including the case of South Korea, provide insights into the cause of major financial crises in small open developing countries, including the Asian currency crisis of 1998. Originality/value - This paper presents new empirical facts on the relationship between the occurrence of rare disaster events (or stochastic volatility) and growth volatility. Increasing data frequency allows for greater accuracy in assessing a country's specific risk. Our findings suggest that financial market and institutional stability can be vital for buffering against rare disaster shocks. It is necessary to preemptively strengthen the foundation for financial stability in developing countries and increase the quality of the information provided to markets.
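
The estimation described above separates a rare-disaster likelihood from ordinary stochastic volatility. As a rough, stylized sketch of that idea only (not the Barro and Jin (2021) estimator), the following Python snippet fits a two-component mixture model of quarterly consumption growth with a fixed disaster size and volatility, and samples the disaster probability with a random-walk Metropolis step under a flat prior; all numbers are simulated for illustration.

```python
# Stylized sketch (not the Barro-Jin 2021 estimator): random-walk Metropolis
# sampling of the disaster probability p in a two-component mixture model of
# quarterly consumption growth, with the disaster size and volatility fixed.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

# Simulate quarterly consumption growth: normal quarters plus rare disasters.
T, p_true, mu, sigma, disaster_drop = 200, 0.02, 0.005, 0.01, 0.07
disasters = rng.random(T) < p_true
growth = rng.normal(mu, sigma, T) - disaster_drop * disasters

def log_lik(p):
    """Mixture likelihood: each quarter is a disaster with probability p."""
    dens = (1 - p) * norm.pdf(growth, mu, sigma) + p * norm.pdf(growth, mu - disaster_drop, sigma)
    return np.sum(np.log(dens))

# Random-walk Metropolis on p with a flat prior on (0, 1).
draws, p_cur, ll_cur = [], 0.05, log_lik(0.05)
for _ in range(5000):
    p_prop = p_cur + rng.normal(0, 0.01)
    if 0 < p_prop < 1:
        ll_prop = log_lik(p_prop)
        if np.log(rng.random()) < ll_prop - ll_cur:
            p_cur, ll_cur = p_prop, ll_prop
    draws.append(p_cur)

post = np.array(draws[1000:])  # discard burn-in
print(f"posterior mean of p: {post.mean():.3f}, true p: {p_true}")
```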

Evaluating Interval Estimates for Comparing Two Proportions with Rare Events

  • Park, Jin-Kyung;Kim, Yong-Dai;Lee, Hak-Bae
    • The Korean Journal of Applied Statistics / v.25 no.3 / pp.435-446 / 2012
  • Epidemiologic studies frequently try to estimate the impact of a specific risk factor. The risk difference and the risk ratio are generally useful measurements for this purpose. When such measurements are used for rare events, the standard approaches based on the normal approximation may fail, in particular when no events are observed. In this paper, we discuss and evaluate several existing methods for constructing confidence intervals around risk differences and risk ratios using Monte Carlo simulations when the disease of interest is rare. The results in this paper provide guidance on how to construct interval estimates of the risk difference and the risk ratio when no events are observed.
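
The evaluation described above is a Monte Carlo comparison of interval methods under rare events. The sketch below, which is only an illustration of that simulation design and not the paper's code, contrasts the standard Wald interval for a risk difference with the Agresti-Caffo adjusted interval; the sample sizes, event probabilities, and interval choices are assumptions made here.

```python
# Monte Carlo check (illustration only): coverage of the Wald interval vs. the
# Agresti-Caffo interval for a risk difference when events are rare, so that
# zero-event samples are common.
import numpy as np

rng = np.random.default_rng(1)
n1 = n2 = 200                    # group sizes
p1, p2 = 0.005, 0.02             # rare event probabilities
true_diff = p1 - p2
z = 1.96                         # 95% normal quantile

def wald_ci(x1, x2, n1, n2):
    r1, r2 = x1 / n1, x2 / n2
    se = np.sqrt(r1 * (1 - r1) / n1 + r2 * (1 - r2) / n2)
    d = r1 - r2
    return d - z * se, d + z * se

def agresti_caffo_ci(x1, x2, n1, n2):
    # Add one pseudo-success and one pseudo-failure to each group.
    r1, r2 = (x1 + 1) / (n1 + 2), (x2 + 1) / (n2 + 2)
    se = np.sqrt(r1 * (1 - r1) / (n1 + 2) + r2 * (1 - r2) / (n2 + 2))
    d = r1 - r2
    return d - z * se, d + z * se

cover = {"wald": 0, "agresti_caffo": 0}
reps = 20000
for _ in range(reps):
    x1, x2 = rng.binomial(n1, p1), rng.binomial(n2, p2)
    for name, ci in (("wald", wald_ci), ("agresti_caffo", agresti_caffo_ci)):
        lower, upper = ci(x1, x2, n1, n2)
        cover[name] += lower <= true_diff <= upper

for name, c in cover.items():
    print(f"{name}: empirical coverage {c / reps:.3f} (nominal 0.95)")
```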

A Simple Approach to Calculate CDF with Non-rare Events in Seismic PSA Model of Korean Nuclear Power Plants (A Study on Non-rare Event Modeling Methods to Prevent CDF Overestimation in Seismic PSA of Domestic Nuclear Power Plants)

  • Lim, Hak Kyu
    • Journal of the Korean Society of Safety / v.36 no.5 / pp.86-91 / 2021
  • Calculating a scrutable core damage frequency (CDF) for nuclear power plants is an important component of seismic probabilistic safety assessment (SPSA). In this work, a simple approach is developed to calculate the CDF from minimal cut sets (MCSs) that contain non-rare events. When conventional calculation methods based on the rare event approximation are employed, the CDF of industry SPSA models is significantly overestimated because of non-rare events in the MCSs. Recently, quantification algorithms using binary decision diagrams (BDDs) have been introduced to prevent CDF overestimation in SPSA. However, BDD structures are generated from only a small part of the whole set of MCSs due to limited computational memory, and they cannot easily be reviewed because of their complicated logic structure. This study suggests a simple approach for a scrutable CDF calculation based on the whole set of MCSs in the SPSA system analysis model; the results of the proposed approach are compared with the outputs of existing algorithms, which helps in avoiding CDF overestimation.
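
As a toy illustration of why the rare event approximation misbehaves with non-rare events (and not a description of the author's proposed approach), the sketch below compares the sum of minimal-cut-set probabilities with a direct Monte Carlo evaluation of the same cut set logic; the basic events and probabilities are invented for the example.

```python
# Toy illustration: with non-rare basic events, the rare-event approximation
# (sum of MCS probabilities) overestimates the top event probability relative
# to a direct Monte Carlo evaluation of the MCS logic.
import numpy as np

rng = np.random.default_rng(2)

# Basic event probabilities; seismic failure probabilities are often non-rare.
p = {"A": 0.4, "B": 0.5, "C": 0.3, "D": 0.2}
mcs_list = [("A", "B"), ("A", "C"), ("B", "D")]  # minimal cut sets

# Rare-event approximation: sum of cut set probabilities.
rea = sum(np.prod([p[e] for e in cut]) for cut in mcs_list)

# Monte Carlo estimate of the exact union probability.
n = 1_000_000
samples = {e: rng.random(n) < q for e, q in p.items()}
top = np.zeros(n, dtype=bool)
for cut in mcs_list:
    cut_occurs = np.ones(n, dtype=bool)
    for e in cut:
        cut_occurs &= samples[e]
    top |= cut_occurs

print(f"rare-event approximation: {rea:.3f}")
print(f"Monte Carlo (near-exact): {top.mean():.3f}")
```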

Geographical Visualization of Rare Events

  • Roh, Hye-Jung;Jeong, Jae-Joon
    • Proceedings of the KSRS Conference / 2007.10a / pp.434-437 / 2007
  • Maps contain and effectively visualize a great deal of spatial information. Advances in GIS enable researchers to analyze and represent spatial information through digital maps. Choropleth maps represent quantities such as rates, percentages, or densities. Researchers generally make choropleth maps using raw rates, but when events are rare, raw rates may not be sufficient to represent the spatial phenomenon: if the population at risk is large and events are rare, we cannot be confident that the raw rate is reliable. The objective of this study is to make choropleth maps using several rate calculation methods and compare them. We use three methods in choropleth mapping: the raw rate, an empirical Bayesian method, and a spatial rate method, the latter two of which use prior probabilities. The experiments reveal that the resulting maps differ somewhat depending on the method used. We suggest that the raw rate cannot be the only way to make a rate map and that researchers should choose a method appropriate to their objectives.
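
Of the three rate-calculation methods mentioned above, the empirical Bayes approach is the one most easily sketched in a few lines. The snippet below shows a Marshall-style global empirical Bayes smoother that shrinks raw rates toward the overall rate; the region counts and populations are invented, and this is only one plausible variant of the method, not the authors' implementation.

```python
# Hedged sketch of a global empirical Bayes (Marshall-type) rate smoother that
# shrinks raw rates toward the overall rate, with more shrinkage for regions
# with small populations. The counts below are made up for illustration.
import numpy as np

events = np.array([0, 1, 3, 0, 12])              # rare event counts per region
pop    = np.array([500, 800, 2500, 300, 20000])  # population at risk per region

raw_rate = events / pop

# Method-of-moments prior estimates (Marshall 1991 style).
overall = events.sum() / pop.sum()                               # prior mean
s2 = np.sum(pop * (raw_rate - overall) ** 2) / pop.sum() - overall / pop.mean()
s2 = max(s2, 0.0)                                                # variance cannot be negative

# Shrinkage weights: large populations keep their raw rate, small ones shrink.
w = s2 / (s2 + overall / pop)
eb_rate = w * raw_rate + (1 - w) * overall

for i, (r, e) in enumerate(zip(raw_rate, eb_rate)):
    print(f"region {i}: raw {r:.5f} -> EB smoothed {e:.5f}")
```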

Use of the t-Distribution to Construct Seismic Hazard Curves for Seismic Probabilistic Safety Assessments

  • Yee, Eric
    • Nuclear Engineering and Technology / v.49 no.2 / pp.373-379 / 2017
  • Seismic probabilistic safety assessments are used to help understand the impact potential seismic events can have on the operation of a nuclear power plant. An important component of a seismic probabilistic safety assessment is the seismic hazard curve, which shows the frequency of seismic events. However, these hazard curves are estimated assuming a normal distribution of the seismic events, which may not be a well-supported assumption given the number of recorded events at each source-to-site distance. The use of a normal distribution makes the calculations significantly easier but may underestimate or overestimate the rarer events, which are of concern to nuclear power plants. This paper presents a preliminary exploration of the effect of using a distribution that perhaps better represents the distribution of events, such as the t-distribution, to describe the data. The integration of a probability distribution with potentially larger tails essentially pushes the hazard curves outward, suggesting a different range of frequencies for use in seismic probabilistic safety assessments. Therefore, the use of a more realistic distribution increases the calculated frequencies, suggesting that rare events are less rare than previously thought in terms of seismic probabilistic safety assessment. However, the opposite was observed with the ground motion prediction equation considered.
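
The core observation above is that a heavier-tailed distribution assigns more probability to extreme events than a normal distribution fitted with the same location and scale. The sketch below illustrates that point numerically with scipy; the residual scale and degrees of freedom are illustrative assumptions, not values from the paper.

```python
# Minimal numerical sketch: for the same location and scale, a t-distribution
# with few degrees of freedom assigns more probability to extreme ground
# motions than a normal distribution, pushing hazard curves outward.
from scipy.stats import norm, t

mu, sigma, dof = 0.0, 0.6, 5          # location/scale of residuals (illustrative)
epsilons = [1.0, 2.0, 3.0]            # standard deviations above the median

for eps in epsilons:
    x = mu + eps * sigma
    p_norm = norm.sf(x, loc=mu, scale=sigma)
    p_t = t.sf(x, df=dof, loc=mu, scale=sigma)
    print(f"exceed {eps:.0f} sigma: normal {p_norm:.4e}, t(df={dof}) {p_t:.4e}")
```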

Comparison Of Interval Estimation For Relative Risk Ratio With Rare Events

  • Kim, Yong Dai;Park, Jin-Kyung
    • Communications for Statistical Applications and Methods / v.11 no.1 / pp.181-187 / 2004
  • One of the objectives of epidemiologic studies is to detect the amount of change caused by a specific risk factor. The risk ratio is one of the most useful measurements in epidemiology. When we perform inference for this measurement with rare events, the standard approach based on the normal approximation may fail, in particular when no disease cases are observed. In this paper, we discuss and evaluate several existing methods for constructing a confidence interval for the risk ratio through simulation when the disease of interest is rare. The results in this paper provide guidance on how to construct interval estimates for the risk difference and the risk ratio when no disease cases are observed.
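
One widely used interval of the kind evaluated above is the Wald interval on the log risk ratio with a 0.5 continuity correction when zero counts occur. The sketch below implements that interval as an illustration; it is not necessarily the method the paper recommends.

```python
# Hedged sketch of a Wald interval on the log risk ratio, with 0.5 added to
# each cell of the 2x2 table when a zero count would otherwise make the
# estimate or its standard error undefined.
import math

def risk_ratio_ci(x1, n1, x2, n2, z=1.96):
    """Approximate 95% CI for the risk ratio (group 1 vs. group 2)."""
    if x1 == 0 or x2 == 0 or x1 == n1 or x2 == n2:
        # 0.5 continuity correction applied to every cell of the 2x2 table.
        x1, n1, x2, n2 = x1 + 0.5, n1 + 1.0, x2 + 0.5, n2 + 1.0
    rr = (x1 / n1) / (x2 / n2)
    se_log = math.sqrt(1 / x1 - 1 / n1 + 1 / x2 - 1 / n2)
    return rr, rr * math.exp(-z * se_log), rr * math.exp(z * se_log)

# Example: no cases in the exposed group, 3 cases among 1000 unexposed.
print(risk_ratio_ci(0, 1000, 3, 1000))
```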

On the Interval Estimation of the Difference between Independent Proportions with Rare Events

  • Kim, Yongdai;Choi, Daewoo
    • Communications for Statistical Applications and Methods / v.7 no.2 / pp.481-487 / 2000
  • When we construct an interval estimate for the difference of two independent proportions with rare events, the standard approach based on the normal approximation behaves badly in many cases. The problem becomes more severe when no successes are observed in either group. In this paper, we compare two alternative methods of constructing a confidence interval for the difference of two independent proportions by simulation. One is based on the profile likelihood and the other is the Bayesian probability interval. It is shown that the Bayesian interval estimator is easy to implement and performs almost identically to the best frequentist method, the profile likelihood approach.
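
The Bayesian probability interval described above can be sketched with independent Beta posteriors for the two proportions. The snippet below assumes Jeffreys Beta(0.5, 0.5) priors and an equal-tailed posterior interval; the paper's exact prior and interval construction may differ.

```python
# Minimal sketch of a Bayesian probability interval for p1 - p2, assuming
# independent Jeffreys Beta(0.5, 0.5) priors (an assumption made here).
import numpy as np

rng = np.random.default_rng(3)

def bayes_interval(x1, n1, x2, n2, level=0.95, draws=100_000):
    """Equal-tailed posterior interval for p1 - p2 under independent Beta posteriors."""
    p1 = rng.beta(x1 + 0.5, n1 - x1 + 0.5, draws)
    p2 = rng.beta(x2 + 0.5, n2 - x2 + 0.5, draws)
    alpha = (1 - level) / 2
    return np.quantile(p1 - p2, [alpha, 1 - alpha])

# Example with no successes observed in either group.
print(bayes_interval(0, 500, 0, 400))
```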

A Study on Human Error Countermeasures considering Hazardous Situational Context among Organizational Factors in NPP (Improving Human Error Management Measures Considering the Situational Context of Organizational Risk Factors in Nuclear Power Plants)

  • Luo, Meiling;Kim, Sa-Kil;Lee, Yong-Hee
    • Journal of the Korean Society of Safety / v.30 no.1 / pp.87-93 / 2015
  • Most incidents and accidents involving humans during the operation of NPPs tend to be shaped by a complicated variety of organizational, individual, and environmental factors. The salient features of this kind of human error in NPPs are its extremely low frequency, its extremely high complexity, and the extremely serious damage it can cause to human life and property; our research team defines such errors as 'rare human errors'. To prevent rare human errors, most researchers and analysts invariably insist that the root causes be made clear. Making them clear, however, is difficult because the root causes are varied and uncertain. Moreover, existing tools have limits in that they do not adapt to all operating situations and circumstances, such as design basis events. The purpose of this study is to improve the management of rare human error hazards by considering their situational context. This evidence-based attempt to address human errors could be useful for preventing rare and critical events that may occur in the future.

Probability subtraction method for accurate quantification of seismic multi-unit probabilistic safety assessment

  • Park, Seong Kyu;Jung, Woo Sik
    • Nuclear Engineering and Technology / v.53 no.4 / pp.1146-1156 / 2021
  • A single-unit probabilistic safety assessment (SUPSA) has complex Boolean logic equations for accident sequences. A multi-unit probabilistic safety assessment (MUPSA) model is developed by revising and combining SUPSA models in order to reflect plant state combinations (PSCs). These PSCs represent combinations of core damage and non-core damage states of nuclear power plants (NPPs). Since all these Boolean logic equations have complemented gates (NOT gates), it is not easy to generate exact Boolean solutions. The delete-term approximation method (DTAM) has been widely applied for generating approximate minimal cut sets (MCSs) from complex Boolean logic equations with complemented gates. By applying DTAM, an approximate conditional core damage probability (CCDP) has been calculated in SUPSA and MUPSA. It was found that the CCDP calculated by DTAM is overestimated when complemented gates have non-rare events. In particular, the CCDP overestimation drastically increases if a seismic SUPSA or MUPSA has complemented gates with many non-rare events. The objective of this study is to suggest a new quantification method, named the probability subtraction method (PSM), that replaces DTAM. The PSM calculates an accurate CCDP even when a SUPSA or MUPSA has complemented gates with many non-rare events. In this paper, the PSM is explained, and its accuracy is validated through applications to a few MUPSAs.
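
The overestimation discussed above arises because delete-term style quantification effectively treats complemented events as if they were certain. The toy calculation below illustrates that effect for a single cut set with an independent complemented event; it is only a schematic illustration, not the DTAM algorithm or the proposed PSM.

```python
# Toy numerical illustration (not the paper's PSM): quantifying a cut set as
# if its complemented part were certain overestimates its probability, and the
# error grows as the complemented events become non-rare.
p_a = 0.3                      # probability of the retained (non-rare) event A
for p_b in (0.001, 0.1, 0.5):  # probability of the complemented event B
    exact = p_a * (1 - p_b)    # P(A and not B), assuming independence
    approx = p_a               # complemented term effectively treated as probability 1
    print(f"P(B)={p_b}: exact {exact:.4f}, approximate {approx:.4f}, "
          f"overestimation {(approx - exact) / exact:.1%}")
```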

A Study on the Power Comparison between Logistic Regression and Offset Poisson Regression for Binary Data

  • Kim, Dae-Youb;Park, Heung-Sun
    • Communications for Statistical Applications and Methods / v.19 no.4 / pp.537-546 / 2012
  • In this paper, for analyzing binary data, Poisson regression with an offset and logistic regression are compared with respect to power via simulations. The Poisson distribution can be used as an approximation of the binomial distribution when n is large and p is small; we investigate whether the same conditions hold for the power of significance tests in logistic regression and offset Poisson regression. The result is that, when the offset size is large, offset Poisson regression has power similar to logistic regression for rare events, and it retains acceptable power even at a moderate prevalence rate. However, with a small offset size (< 10), offset Poisson regression should be used with caution for both rare and common events. These results provide guidelines for users who want to apply offset Poisson regression models to binary data.
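
The comparison above pits logistic regression against Poisson regression with a log offset on grouped binary data. The sketch below fits both models with statsmodels on simulated rare-event data as an illustration of the setup; the simulation design and parameter values are assumptions, not those of the paper.

```python
# Hedged sketch of the two models being compared, assuming statsmodels is
# available; the simulated grouped data are illustrative only.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)

# Grouped binary data: each row is a group with an offset (group size) and a covariate.
n_groups = 200
offset_size = rng.integers(20, 200, n_groups)          # number of trials per group
x = rng.normal(size=n_groups)
p = 1 / (1 + np.exp(-(-4.0 + 0.5 * x)))                # rare-event probabilities
events = rng.binomial(offset_size, p)

X = sm.add_constant(x)

# Logistic regression on (events, non-events) counts.
logit_fit = sm.GLM(np.column_stack([events, offset_size - events]),
                   X, family=sm.families.Binomial()).fit()

# Poisson regression on event counts with log(group size) as offset.
pois_fit = sm.GLM(events, X, family=sm.families.Poisson(),
                  offset=np.log(offset_size)).fit()

print("logistic slope:", logit_fit.params[1], "p-value:", logit_fit.pvalues[1])
print("offset Poisson slope:", pois_fit.params[1], "p-value:", pois_fit.pvalues[1])
```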