• Title/Summary/Keyword: Probabilistic Approach (확률적 접근법)


Construction of Logic Trees and Hazard Curves for Probabilistic Tsunami Hazard Analysis (확률론적 지진해일 재해도평가를 위한 로직트리 작성 및 재해곡선 산출 방법)

  • Jho, Myeong Hwan;Kim, Gun Hyeong;Yoon, Sung Bum
    • Journal of Korean Society of Coastal and Ocean Engineers / v.31 no.2 / pp.62-72 / 2019
  • Because the intensity and source location of a tsunami are difficult to forecast, countermeasures prepared on the basis of a deterministic approach may fail to work properly. There is therefore increasing demand for tsunami hazard analyses that treat the uncertainties of tsunami behavior probabilistically. In this paper a fundamental study is conducted to perform a probabilistic tsunami hazard analysis (PTHA) for the tsunamis that have caused disasters on the east coast of Korea. A logic tree approach is employed to consider the uncertainties of the initial free-surface displacement and of the tsunami height distribution along the coast. The branches of the logic tree are constructed to reflect the characteristics of tsunamis that have attacked the east coast of Korea. Because the computational time grows nonlinearly with the number of branches when extracting the fractile curves, an improved method that remains practical even for a huge number of branches is proposed. The performance of the discrete weight distribution method, proposed first in this study, is compared with those of the conventional sorting method and the Monte Carlo method. The proposed method matches the conventional methods in accuracy and is more efficient than the conventional sorting method in computational time. The Monte Carlo method, however, becomes the most efficient of the three when the numbers of branches and fault segments increase significantly.
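
The Monte Carlo extraction of fractile curves mentioned in the abstract can be sketched as follows; the branch weights, hazard curves, and sample sizes here are illustrative stand-ins, not the paper's data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical logic tree: each branch carries a weight and a hazard curve
# (exceedance probability at each tsunami height); all values are toys.
heights = np.linspace(0.5, 5.0, 10)           # tsunami heights (m)
n_branches = 1000
weights = rng.dirichlet(np.ones(n_branches))  # branch weights, sum to 1
scales = rng.uniform(0.5, 2.0, n_branches)
curves = np.exp(-heights[None, :] / scales[:, None])  # one curve per branch

def fractile_curves_mc(curves, weights, qs, n_samples=5000):
    """Sample branches with probability equal to their weight, then take
    per-height quantiles of the sampled curves: a Monte Carlo estimate of
    the weighted fractile hazard curves."""
    idx = rng.choice(len(weights), size=n_samples, p=weights)
    return np.quantile(curves[idx], qs, axis=0)

median, p84 = fractile_curves_mc(curves, weights, [0.5, 0.84])
```

Sampling branches by weight avoids sorting all weighted curves, which is what makes the Monte Carlo variant attractive when the branch count explodes.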

Estimation of Mean Life and Reliability of Highway Pavement Based on Reliability Theory (신뢰성 개념을 이용한 포장의 평균수명 및 신뢰도 예측)

  • Do, Myung-Sik
    • KSCE Journal of Civil and Environmental Engineering Research / v.30 no.5D / pp.497-504 / 2010
  • In this paper, the author presents a reliability estimation technique to analyze the effects of traffic loads on pavement mean life, based on the national highway database of the Suwon and Uijeongbu regions from 1999 to 2008. The mean life, its standard deviation, and the reliability of pavement sections are calculated using an appropriate distribution, the lognormal distribution, based on reliability theory. Both the probability paper method and maximum likelihood estimation are used to estimate the parameters. The author found that the mean life of newly constructed sections is about 6.5 to 7.9 years, and that of overlaid sections about 7.3 to 9.1 years. The author also ascertained that the cumulative failure probabilities of pavement life obtained from the proposed methods and from the observed data are similar. Such an assessment methodology based on reliability theory can provide useful information for maintenance planning in pavement management systems as additional life data on pavement sections accumulate.
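
The lognormal mean-life and reliability computation can be sketched as follows, with synthetic life data standing in for the national highway database:

```python
import math

import numpy as np

rng = np.random.default_rng(1)

# Synthetic pavement life data (years) standing in for the highway database.
lives = rng.lognormal(mean=2.0, sigma=0.25, size=200)

# Lognormal MLE: sample mean and std of the log-lives.
mu = np.log(lives).mean()
sigma = np.log(lives).std()

mean_life = math.exp(mu + sigma**2 / 2)   # lognormal mean life

def reliability(t):
    """R(t) = P(life > t) under the fitted lognormal."""
    z = (math.log(t) - mu) / sigma
    return 0.5 * math.erfc(z / math.sqrt(2))

# By construction the reliability at the median life exp(mu) is 0.5.
```

The probability paper method mentioned in the abstract would instead fit mu and sigma by a straight line on lognormal probability paper; the MLE shown here is the closed-form alternative.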

Application of a large-scale ensemble climate simulation database for estimating the extreme rainfall (극한강우량 산정을 위한 대규모 기후 앙상블 모의자료의 적용)

  • Kim, Youngkyu;Son, Minwoo
    • Journal of Korea Water Resources Association / v.55 no.3 / pp.177-189 / 2022
  • The purpose of this study is to apply the d4PDF (Database for Policy Decision-Making for Future Climate Change), constructed from a large-scale ensemble climate simulation, to the estimation of probable rainfall of low frequency and high intensity. In addition, this study analyzes the uncertainty caused by frequency analysis by comparing the probable rainfall estimated from the d4PDF with that estimated from observed data and frequency analysis at the Geunsam, Imsil, Jeonju, and Jangsu stations. The d4PDF consists of 50 ensembles, each providing 60 years of climate and weather data such as rainfall and temperature, so 3,000 annual maximum daily rainfalls could be collected for each station. Exploiting this, the study does not apply frequency analysis; instead, the probable rainfalls with return periods of 10 to 1,000 years are estimated by ranking the 3,000 values in a non-parametric approach. The probable rainfall estimated from the d4PDF was then compared with that estimated from the observed rainfall using the Gumbel or GEV distribution, and the deviation between the two was computed. This deviation increased as the difference between the return period and the observation period increased. Meanwhile, the d4PDF reasonably estimated probable rainfalls of low frequency and high intensity by minimizing the uncertainty that arises from applying frequency analysis to observed records of short duration.
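
The non-parametric ranking approach can be sketched as follows; the Gumbel sample below merely stands in for the 3,000 d4PDF annual maxima:

```python
import numpy as np

rng = np.random.default_rng(2)

# Stand-in for 3,000 annual-maximum daily rainfalls (50 ensembles x 60 yr).
annual_max = rng.gumbel(loc=100.0, scale=30.0, size=3000)

def probable_rainfall(sample, return_period):
    """Non-parametric estimate: sort the sample and interpolate the value
    whose empirical non-exceedance probability is 1 - 1/T, using the
    Weibull plotting position i/(n+1). Valid for T up to about n."""
    x = np.sort(sample)
    n = len(x)
    rank = (1.0 - 1.0 / return_period) * (n + 1)
    i = int(np.clip(np.floor(rank), 1, n - 1))
    frac = rank - i
    # linear interpolation between neighbouring order statistics
    return x[i - 1] * (1 - frac) + x[i] * frac

r100 = probable_rainfall(annual_max, 100)
r1000 = probable_rainfall(annual_max, 1000)
```

With 3,000 values, even the 1,000-year quantile lies inside the sample, which is exactly what lets the study skip fitting a Gumbel or GEV distribution.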

Bayesian Reliability Analysis Using Kriging Dimension Reduction Method(KDRM) (크리깅 기반 차원감소법을 이용한 베이지안 신뢰도 해석)

  • An, Da-Un;Choi, Joo-Ho;Won, Jun-Ho
    • Journal of the Computational Structural Engineering Institute of Korea / v.21 no.3 / pp.275-280 / 2008
  • A technique for reliability-based design optimization (RBDO) is developed based on the Bayesian approach, which can deal with the epistemic uncertainty arising from a limited number of data. Until recently, conventional RBDO was implemented mostly by assuming the uncertainty to be aleatory, meaning the statistical properties are completely known. In practice, however, this is often not the case because the data are insufficient for estimating the statistical information, which makes existing RBDO methods less useful. In this study, a Bayesian reliability is introduced to account for the epistemic uncertainty; it is defined as the lower confidence bound of the probability distribution of the original reliability. The Bayesian reliability requires a double loop of conventional reliability analyses, which can be computationally expensive. The Kriging-based dimension reduction method (KDRM), a new efficient tool for reliability analysis, is employed to this end. The proposed method is illustrated with a couple of numerical examples.
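
The idea of a Bayesian reliability as a lower confidence bound can be illustrated with a much simpler model than the paper's double-loop KDRM: a Beta posterior on a pass/fail reliability under a uniform prior, with trial counts invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy epistemic setting: reliability observed only as k successes in n
# trials. Under a uniform prior the posterior on the reliability is
# Beta(k+1, n-k+1); a "Bayesian reliability" in the paper's sense is a
# lower confidence bound of such a distribution (here the 5th percentile,
# estimated by Monte Carlo).
n_trials, k_success = 20, 19
posterior = rng.beta(k_success + 1, n_trials - k_success + 1, size=100_000)

point_estimate = k_success / n_trials          # 0.95
bayesian_reliability = np.percentile(posterior, 5)
```

The gap between the point estimate and the lower bound shrinks as data accumulate, which is the epistemic-uncertainty behavior the paper exploits.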

Comparative Study of Reliability Design Methods by Application to Donghae Harbor Breakwaters. 2. Sliding of Caissons (동해항 방파제를 대상으로 한 신뢰성 설계법의 비교 연구. 2. 케이슨의 활동)

  • Kim, Seung-Woo;Suh, Kyung-Duck;Oh, Young-Min
    • Journal of Korean Society of Coastal and Ocean Engineers / v.18 no.2 / pp.137-146 / 2006
  • This is the second of a two-part paper comparing reliability design methods by application to the Donghae Harbor breakwaters. In this paper, Part 2, we deal with sliding of caissons. The failure modes of a vertical breakwater, which consists of a caisson mounted on a rubble mound, include sliding and overturning of the caisson and failure of the rubble mound or subsoil, of which sliding of the caisson occurs most frequently. The traditional deterministic design for sliding failure uses the concept of a safety factor: the resistance should exceed the load by a certain factor (e.g. 1.2). The safety of a structure, however, cannot be evaluated quantitatively with a safety factor. The reliability design method, which has recently been the subject of active research, enables one to evaluate the safety of a structure quantitatively by calculating its probability of failure. Reliability design methods are classified into three categories, Levels 1, 2, and 3, depending on the level of probabilistic concepts employed. In this study, we apply the reliability design methods to the sliding of the caissons of the Donghae Harbor breakwaters, which were designed by traditional deterministic methods and damaged in 1987. Analyses are made for the breakwaters before the damage and after reinforcement. The probability of failure before the damage is much higher than the allowable value, indicating that the breakwater was under-designed; after reinforcement it is close to the allowable value, indicating that the breakwater is no longer in danger. The results of the different reliability design methods are in fairly good agreement, confirming that there is little difference among them.
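
A minimal Level 2 style check of the sliding mode can be written in a few lines, assuming independent normal load and resistance with made-up moments (not Donghae Harbor values):

```python
import math

# Illustrative first-order, second-moment (Level 2) check of caisson
# sliding: resistance R (friction) and load S (wave force) treated as
# independent normal variables; the moments below are invented.
mu_R, sigma_R = 1200.0, 150.0   # resistance (kN)
mu_S, sigma_S = 800.0, 200.0    # load (kN)

beta = (mu_R - mu_S) / math.sqrt(sigma_R**2 + sigma_S**2)  # reliability index
p_fail = 0.5 * math.erfc(beta / math.sqrt(2))              # P(R - S < 0)

safety_factor = mu_R / mu_S  # the deterministic check compares this to 1.2
```

The contrast the paper draws is visible here: the safety factor (1.5) says nothing about scatter, while the failure probability reflects both means and standard deviations.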

Hierarchical Event Detection for Low-Energy Operation In Video Surveillance Embedded System (영상 감시용 임베디드 시스템에서의 저에너지 동작을 위한 계층적 사건 탐지)

  • Kim, Tae-Rim;Kim, Bum-Soo;Kim, Dae-Joon;Kim, Geon-Su
    • Journal of the Institute of Convergence Signal Processing / v.12 no.3 / pp.204-211 / 2011
  • Embedded systems for video surveillance require complex, power-hungry modules capable of real-time high-performance data processing and wide-bandwidth communication while consuming little power. Current battery technology, however, has not developed far enough to give portable embedded systems a long lifetime. In this paper, a new approach that operates with low energy consumption is proposed to overcome this situation while guaranteeing detection accuracy. The designed method combines a variety of detection algorithms hierarchically to detect events happening around the system. The change in energy-consumption characteristics is shown alongside the change in the probabilistic characteristics of events, and the relationship between them is explained analytically from experiments. Furthermore, several techniques for achieving low energy consumption and high detection accuracy are described, depending on whether the events are static or dynamic.

On asymptotics for a bias-corrected version of the NPMLE of the probability of discovering a new species (신종발견확률의 편의보정 비모수 최우추정량에 관한 연구)

  • 이주호
    • The Korean Journal of Applied Statistics / v.6 no.2 / pp.341-353 / 1993
  • As an estimator of the conditional probability of discovering a new species at the next observation, after a sample of a certain size has been taken, the one proposed by Good (1953) has been the most widely used. Recently, Clayton and Frees (1987) showed via simulation that their nonparametric maximum likelihood estimator (NPMLE) has smaller MSE than Good's estimator when the population is relatively nonuniform. Lee (1989) proved that their conjecture is asymptotically true for truncated geometric population distributions. One shortcoming of the NPMLE, however, is that it has a considerable negative bias. In this study we propose a bias-corrected version of the NPMLE for virtually all realistic population distributions. We also show that it has a smaller asymptotic MSE than Good's estimator except when the population is very uniform. A Monte Carlo simulation was performed for small sample sizes, and the results support the asymptotic findings.
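
Good's (1953) estimator, which the paper takes as its baseline, is simply the fraction of singletons in the sample; a tiny sketch with invented observations:

```python
from collections import Counter

# Good's (1953) estimator of the probability that the next observation is
# a new species: (number of species seen exactly once) / (sample size).
sample = ["a", "b", "a", "c", "d", "d", "e", "a"]  # invented observations

counts = Counter(sample)
n = len(sample)                                         # 8 observations
singletons = sum(1 for c in counts.values() if c == 1)  # b, c, e -> 3

good_estimate = singletons / n  # 3/8 = 0.375
```

The NPMLE and its bias-corrected version discussed in the paper are alternatives to this ratio; they are not reproduced here.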

The assessment of performances of regional frequency models using Monte Carlo simulation: Index flood method and artificial neural network model (몬테카를로 시뮬레이션을 이용한 지역빈도해석 기법의 성능 분석: 홍수지수법과 인공신경망 모델)

  • Lee, Joohyung;Seo, Miru;Park, Jaeheyon;Heo, Jun-Haeng
    • Proceedings of the Korea Water Resources Association Conference / 2021.06a / pp.156-156 / 2021
  • This study used Monte Carlo simulation to evaluate the performance of artificial neural network models based on regional frequency analysis against the widely used index flood method. As computer technology has advanced, artificial intelligence has become more accessible and is being applied in various fields, including hydrology. Although artificial intelligence is used to predict various hydrologic quantities such as rainfall and discharge, studies on frequency analysis are comparatively few. The artificial neural network models used in this study take the geographic and hydrologic data of a target station and estimate either the station's rainfall quantile (QRT-ANN) or the parameters of its probability distribution (PRT-ANN). Latitude, longitude, and elevation were used as geographic data, and the most recent 30 years of annual maximum daily rainfall at the target station were used as hydrologic data. The accuracy of regional frequency analysis improves as more stations with similar statistical characteristics are included in the region; the relevant statistics are the discordancy, heterogeneity, and goodness-of-fit measures, and the performance of the three regional frequency analysis methods was evaluated under various conditions of these statistics. Assuming n stations in the target region, the index flood method estimates the rainfall quantile of one station from the regional growth curve fitted to the other n-1 stations, and the neural network models are likewise built from n-1 stations and then predict the rainfall quantile or the distribution parameters of the remaining station; for PRT-ANN, rainfall quantiles are computed from the predicted parameters. At each simulation run, the relative root mean square error (RRMSE) of the three methods was computed against the at-site frequency analysis of the generated data. Through this Monte Carlo performance analysis, a regional frequency analysis method appropriate to the statistical characteristics of the observations can be suggested.
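
The RRMSE score used to compare the methods can be written directly; the quantile values below are invented for illustration:

```python
import numpy as np

def rrmse(true_q, est_q):
    """Relative root-mean-square error between at-site quantiles (treated
    as truth in the simulation) and a regional method's estimates."""
    true_q = np.asarray(true_q, dtype=float)
    est_q = np.asarray(est_q, dtype=float)
    return float(np.sqrt(np.mean(((est_q - true_q) / true_q) ** 2)))

# Invented quantiles (mm) at three stations for one simulation run.
at_site = [120.0, 150.0, 180.0]
index_flood = [126.0, 144.0, 189.0]

print(round(rrmse(at_site, index_flood), 4))  # → 0.0469
```

Normalizing each error by the at-site value is what lets stations with very different rainfall magnitudes be averaged into one score.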

Human Exposure Assessment of Pesticide Residues in Cattle by-product Fed the Rice Straw (농약이 잔류된 볏짚조사료을 급여한 소의 부산물 섭취에 따른 인체노출평가)

  • Gil, Geun-Hwan;Paik, Min-Kyoung;Kim, Jin-Bae;Kim, Chan-Sub;Son, Kyung-Ae;Im, Geon-Jae;Ihm, Yang-Bin;Lee, Kyu-Seung
    • The Korean Journal of Pesticide Science / v.17 no.4 / pp.249-255 / 2013
  • The objective of this study was to assess the exposure of Korean consumers to edifenphos and tricyclazole in by-products of cattle fed the rice straw, using a probabilistic approach. We used tricyclazole and edifenphos residue data in rice straw reported by the National Academy of Agricultural Science (NAAS) in the 1998, 1999, 2001, and 2010 monitoring studies and by the National Agricultural Products Quality Management Service (NAQS) in the 2009 monitoring study. The mean exposures to edifenphos and tricyclazole for all Korean consumers were 0.027% and 0.0006% of the ADI, and the 99th-percentile exposures were 0.034% and 0.0007% of the ADI, respectively. The group of consumers aged 1 to 6 had the lowest exposure to edifenphos and tricyclazole, and the group aged 19 to 29 the highest.

A Study of Statistical Analysis of Rock Joint Directional Data (암반 절리 방향성 자료의 통계적 분석 기법에 관한 연구)

  • 류동우;김영민;이희근
    • Tunnel and Underground Space / v.12 no.1 / pp.19-30 / 2002
  • Rock joint orientation is one of the important geometric attributes that influence the stability of rock structures such as rock slopes and tunnels. In particular, statistical models of the geometric attributes of rock joints enable a probabilistic approach to rock engineering problems. Because the result of probabilistic modeling depends on the choice of statistical model, it is critical to define a representative statistical model for joint orientation data, as well as for joint size and intensity, and to build up a modeling procedure that includes analytical validation. In this paper, we examine a theoretical methodology for statistical estimation and hypothesis testing based on the Fisher distribution and the bivariate normal distribution. In addition, we propose random number generation algorithms applied to the simulation of rock joint networks and to risk analysis.
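
One standard sampler for the Fisher distribution (inverse-CDF sampling of the colatitude about a z-axis mean pole) can serve as a sketch of the kind of random number generation the paper proposes; the concentration parameter below is arbitrary:

```python
import numpy as np

rng = np.random.default_rng(3)

def fisher_sample(kappa, n):
    """Draw n unit vectors from a Fisher distribution with concentration
    kappa about the z-axis mean pole: invert the CDF of cos(theta),
    proportional to exp(kappa*cos(theta)), and add a uniform azimuth."""
    u = rng.random(n)
    cos_t = 1.0 + np.log(1.0 - u * (1.0 - np.exp(-2.0 * kappa))) / kappa
    phi = rng.uniform(0.0, 2.0 * np.pi, n)
    sin_t = np.sqrt(1.0 - cos_t**2)
    return np.column_stack([sin_t * np.cos(phi),
                            sin_t * np.sin(phi),
                            cos_t])

poles = fisher_sample(kappa=50.0, n=1000)  # kappa chosen arbitrarily
```

To simulate a joint set about an arbitrary mean pole, the generated vectors would then be rotated from the z-axis onto that pole.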