• Title/Summary/Keyword: Probabilistic Analysis


Prediction of Loss of Life in Downstream due to Dam Break Flood (댐 붕괴 홍수로 인한 하류부 인명피해 예측)

  • Lee, Jae Young;Lee, Jong Seok;Kim, Ki Young
    • Journal of Korea Water Resources Association / v.47 no.10 / pp.879-889 / 2014
  • In this study, to estimate loss of life from flood characteristics using relationships derived from historical dam-break cases and the factors that determine loss of life, the loss-of-life module applied in LIFESim and a loss-of-life estimation based on a mortality function were proposed, and their applicability to a domestic dam watershed was examined. Flood characteristics such as water depth, flow velocity and arrival time were simulated with the FLDWAV model, and the flood risk area was delineated from the inundation depth. Based on this, the effects of warning, evacuation and sheltering were considered to estimate the number of people exposed to the flood. To estimate fatality rates for the exposed population, the flood hazard area was divided into three zones, and the total number of fatalities was predicted after determining a lethality or mortality function for each zone. In the future, predictions of loss of life due to dam-break floods can be used to quantitatively evaluate flood risk and to establish downstream flood mitigation measures under probabilistic flood scenarios.
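
As a companion to this abstract, here is a minimal sketch of the final zone-by-zone tally it describes: the exposed population in each flood hazard zone multiplied by a zone fatality rate and summed. The zone names, populations and rates below are placeholders, not values from the study or from LIFESim.

```python
# Illustrative sketch (not the LIFESim module itself): loss of life estimated by
# applying an assumed fatality rate to the population exposed in each of the
# three flood hazard zones mentioned in the abstract.

# Hypothetical exposed populations after warning/evacuation effects (persons)
exposed = {"chance_zone": 1200, "compromised_zone": 300, "safe_zone": 2500}

# Assumed representative fatality rates per zone (placeholders, not the paper's
# calibrated mortality functions)
fatality_rate = {"chance_zone": 0.002, "compromised_zone": 0.12, "safe_zone": 0.0002}

loss_of_life = sum(exposed[z] * fatality_rate[z] for z in exposed)
print(f"Estimated loss of life: {loss_of_life:.1f} persons")
```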

Probabilistic Prediction of the Risk of Sexual Crimes Using Weight of Evidence (Weight of Evidence를 활용한 성폭력 범죄 위험의 확률적 예측)

  • KIM, Bo-Eun;KIM, Young-Hoon
    • Journal of the Korean Association of Geographic Information Studies / v.22 no.4 / pp.72-85 / 2019
  • The goal of this study is to predict sexual violence crimes, a risk encountered in everyday life. The study applied the Weight of Evidence method to sexual violence crimes that occurred in part of Cheongju-si over the five years from 2011 to 2015. The results are as follows. First, applying the Weight of Evidence, which accounts for the characteristics of each evidence layer, selected 8 of the 26 candidate evidence layers for predicting sexual violence crime risk: residential area, date of building use approval, individual housing price, floor area ratio, number of basement floors, lot area, security lighting, and recreational facilities, all of which satisfied the reliability criterion during weight calculation. Second, the weights calculated for these 8 evidence layers were combined to produce the final prediction map. The map showed a 16.5% probability of sexual violence crime over 0.3 km² (3.3% of the study area), a 34.5% probability over 1.8 km² (19.0% of the area), and a 75.5% probability over 2.0 km² (20.7% of the area). This study derived the probability of occurrence of sexual violence crime risk and the environmental factors or conditions that could reduce it. These results could serve as basic data for devising preemptive measures to minimize sexual violence, such as police crime-prevention activities.
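
To illustrate the Weight of Evidence calculation the abstract relies on, the sketch below computes the positive and negative weights for a single binary evidence layer and combines one weight with an assumed prior to obtain a posterior probability. All counts and the prior are hypothetical; the study's 26 evidence layers and their reliability test are not reproduced here.

```python
import math

def weights(n_event_with, n_event_total, n_noevent_with, n_noevent_total):
    """Return (W+, W-) for one binary evidence layer (assumed cell counts)."""
    p_b_given_d = n_event_with / n_event_total        # P(evidence present | crime)
    p_b_given_nd = n_noevent_with / n_noevent_total   # P(evidence present | no crime)
    w_plus = math.log(p_b_given_d / p_b_given_nd)
    w_minus = math.log((1 - p_b_given_d) / (1 - p_b_given_nd))
    return w_plus, w_minus

# Posterior log-odds for a map cell = prior log-odds + sum of applicable weights
prior_odds = 0.02 / 0.98                      # assumed 2% prior probability of occurrence
w_plus, w_minus = weights(40, 60, 200, 940)   # hypothetical counts for one evidence layer
posterior_logit = math.log(prior_odds) + w_plus
posterior_prob = 1 / (1 + math.exp(-posterior_logit))
print(f"W+={w_plus:.2f}, W-={w_minus:.2f}, posterior probability={posterior_prob:.2f}")
```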

Critical Strengthening Ratio of CFRP Plate Using Probability and Reliability Analysis for Concrete Railroad Bridge Strengthened by NSM (확률.신뢰도 기법을 적용한 CFRP 플레이트 표면매립보강 콘크리트 철도교의 임계보강비 산정)

  • Oh, Hong-Seob;Sun, Jong-Wan;Oh, Kwang-Chin;Sim, Jong-Sung;Ju, Min-Kwan
    • Journal of the Korea Concrete Institute / v.21 no.6 / pp.681-688 / 2009
  • Railroad bridges are routinely subjected to vibration and impact while in service, so it is important that the strengthening method provide effective capacity against such service loading. In this study, the near-surface-mounted (NSM) strengthening technique is recommended for concrete railroad bridges because it resists dynamic loading more effectively, and at lower cost, than conventional externally bonded strengthening with fiber sheets. However, for the NSM method to be applied widely to concrete railroad bridges, the strengthening ratio must be evaluated with geometrical and material uncertainties taken into account, particularly for bridges in long-term service that lack a clear design history and detailed information such as concrete compressive strength, reinforcement ratio and track characteristics. The purpose of this study is to propose the critical strengthening ratio of CFRP plate for the target concrete railroad bridge, accounting for uncertainties arising from structural deterioration. To do this, Monte Carlo simulation (MCS) of the geometrical and material uncertainties was applied to obtain a reasonable CFRP plate strengthening ratio that reflects these probabilistic uncertainties. Finally, the critical strengthening ratio for NSM strengthening with CFRP plate was calculated from the limit state function based on a target reliability index of 3.5.
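
The following sketch shows the general Monte Carlo reliability procedure named in the abstract: sample a limit state function with random resistance and load, estimate the probability of failure, and compare it with the failure probability implied by a target reliability index of 3.5. The generic g = R - S limit state and all distribution parameters are assumptions, not the paper's CFRP/NSM flexural model.

```python
import math
import random

# Illustrative Monte Carlo reliability sketch with a generic limit state g = R - S.
random.seed(1)
N = 200_000
failures = 0
for _ in range(N):
    R = random.gauss(520.0, 52.0)   # assumed resistance (placeholder mean/std)
    S = random.gauss(300.0, 45.0)   # assumed load effect (placeholder mean/std)
    if R - S < 0:                   # limit state violated
        failures += 1
pf = failures / N

# Failure probability implied by the target reliability index beta = 3.5: Phi(-beta)
beta_target = 3.5
pf_target = 0.5 * math.erfc(beta_target / math.sqrt(2))
print(f"Estimated Pf = {pf:.2e}, target Pf (beta = 3.5) = {pf_target:.2e}")
```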

Prediction of Time to Corrosion for Concrete Bridge Decks Exposed to De-Icing Chemicals (제빙화학제 살포로 인한 콘크리트 교량 바닥판의 철근부식 시작시기의 예측)

  • Lee, Chang-Soo;Yoon, In-Seok;Park, Jong-Hyok
    • Journal of the Korea Concrete Institute / v.15 no.4 / pp.606-614 / 2003
  • The major cause of deterioration of concrete bridge decks exposed to de-icing chemicals is chloride-induced reinforcement corrosion. Thus, to predict the time to corrosion initiation for concrete bridge decks in an urban area, chloride concentration was measured as a function of depth from the surface. A frequency analysis fitted the surface chloride concentration and the chloride diffusion coefficient of the bridge decks with gamma distributions having scale parameters of 0.192 and 29.828 and shape parameters of 7.899 and 1.983, respectively. The average surface chloride concentration is 1.5 kg/m³ and is concentrated between 1 and 2 kg/m³ at the 70% probability level. From these probabilistic results, a minimum cover depth of 26 mm is required to achieve a target service life of more than 20 years. An effective countermeasure for extending the service life of concrete bridge decks exposed to de-icing chemicals is to increase the cover depth and to place high-performance concrete, which reduces the chloride diffusion coefficient and its distribution range.
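
A hedged sketch of the kind of corrosion-initiation calculation behind this abstract: chloride penetration modelled with the error-function solution of Fick's second law, and the time at which the chloride content at the rebar reaches a threshold found by bisection. The diffusion coefficient and threshold below are placeholders; only the 1.5 kg/m³ surface concentration and the 26 mm cover come from the abstract.

```python
import math

Cs = 1.5          # surface chloride concentration, kg/m^3 (abstract's mean value)
D = 1.0e-12       # assumed chloride diffusion coefficient, m^2/s (placeholder)
Cth = 0.9         # assumed critical chloride threshold, kg/m^3 (placeholder)
cover = 0.026     # 26 mm cover depth from the abstract, in metres

def chloride(x, t):
    """C(x, t) = Cs * (1 - erf(x / (2*sqrt(D*t)))), Fick's second law solution."""
    return Cs * (1.0 - math.erf(x / (2.0 * math.sqrt(D * t))))

# Bisection on time until the chloride content at the rebar reaches the threshold
lo, hi = 1.0, 3.2e9            # bracket: 1 second to roughly 100 years
for _ in range(80):
    mid = 0.5 * (lo + hi)
    if chloride(cover, mid) < Cth:
        lo = mid
    else:
        hi = mid
print(f"Time to corrosion initiation ≈ {lo / 3.154e7:.1f} years")
```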

Influence of Modelling Approaches of Diffusion Coefficients on Atmospheric Dispersion Factors (확산계수의 모델링방법이 대기확산인자에 미치는 영향)

  • Hwang, Won Tae;Kim, Eun Han;Jeong, Hae Sun;Jeong, Hyo Joon;Han, Moon Hee
    • Journal of Radiation Protection and Research / v.38 no.2 / pp.60-67 / 2013
  • A diffusion coefficient is an important parameter in predicting atmospheric dispersion with a Gaussian plume model, and its modelling approach varies. In this study, the dispersion coefficients recommended by the U.S. Nuclear Regulatory Commission's (U.S. NRC's) regulatory guide and the Canadian Nuclear Safety Commission's (CNSC's) regulatory guide, and those used in the probabilistic accident consequence analysis codes MACCS and MACCS2, were investigated. Based on the atmospheric dispersion model recommended by the U.S. NRC for a hypothetical accidental release, the influence of the modelling approach on the atmospheric dispersion factor was discussed. It was found that diffusion coefficients are basically predicted from Pasquill-Gifford curves, but various curve-fitting equations are recommended or used. In all models the lateral dispersion coefficient is corrected for the additional spread due to plume meandering, but the modelling approaches differ distinctly. The vertical dispersion coefficient is corrected for the additional plume spread due to surface roughness in all models except the U.S. NRC's recommendation. For a specified surface roughness, the atmospheric dispersion factors differed by up to approximately a factor of 4 depending on the modelling approach for the dispersion coefficient, and for the same model they differed by a factor of 2 to 3 depending on surface roughness.
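
To make the role of the dispersion coefficients concrete, the sketch below evaluates a centerline, ground-level atmospheric dispersion factor X/Q = 1/(π·σy·σz·U) for a Gaussian plume, with σy and σz taken from generic power-law fits. The fit coefficients are placeholders, not the NRC, CNSC or MACCS/MACCS2 curve-fitting equations compared in the paper.

```python
import math

def sigma(x, a, b):
    """Dispersion coefficient as a simple power-law fit sigma = a * x**b (metres)."""
    return a * x**b

def x_over_q(x, wind_speed, ay=0.08, by=0.90, az=0.06, bz=0.85):
    """Centerline ground-level X/Q (s/m^3); coefficients are illustrative only."""
    sigma_y = sigma(x, ay, by)
    sigma_z = sigma(x, az, bz)
    return 1.0 / (math.pi * sigma_y * sigma_z * wind_speed)

# Example: receptor 1 km downwind, 2 m/s wind speed
print(f"X/Q at 1 km ≈ {x_over_q(1000.0, 2.0):.2e} s/m^3")
```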

A Framework of Recognition and Tracking for Underwater Objects based on Sonar Images : Part 2. Design and Implementation of Realtime Framework using Probabilistic Candidate Selection (소나 영상 기반의 수중 물체 인식과 추종을 위한 구조 : Part 2. 확률적 후보 선택을 통한 실시간 프레임워크의 설계 및 구현)

  • Lee, Yeongjun;Kim, Tae Gyun;Lee, Jihong;Choi, Hyun-Taek
    • Journal of the Institute of Electronics and Information Engineers / v.51 no.3 / pp.164-173 / 2014
  • In underwater robotics, vision is a key element for recognition in underwater environments. However, because of turbidity, an underwater optical camera is rarely usable. An underwater imaging sonar, as an alternative, delivers low-quality images that are not stable or accurate enough to detect natural objects by image processing. For this reason, artificial landmarks based on the characteristics of ultrasonic waves, together with a recognition method using a shape-matrix transformation, were proposed and validated in Part 1. However, that approach does not work properly on an undulating and dynamically noisy sea bottom. To solve this, we propose a framework consisting of a likelihood-candidate selection phase, a final-candidate selection phase, a recognition phase and a tracking phase over image sequences, together with a particle-filter-based selection mechanism that eliminates fake candidates and a mean-shift-based tracking algorithm. All four steps run in parallel in real time. The proposed framework is flexible, allowing internal algorithms to be added or modified. A pool test and a sea trial were carried out to verify the performance, and the experimental results are analyzed in detail. Information obtained in the tracking phase, such as relative distance and bearing, is expected to be used for the control and navigation of underwater robots.
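
The sketch below illustrates the probabilistic candidate-selection idea in miniature: candidate detections are weighted by an assumed likelihood relative to a predicted position and resampled, so spurious candidates tend to be eliminated over successive frames. It is not the authors' sonar-image framework; the likelihood model, positions and parameters are invented for illustration.

```python
import math
import random

random.seed(0)

def likelihood(candidate, predicted, scale=5.0):
    """Assumed Gaussian-like score based on distance to the predicted position."""
    d2 = (candidate[0] - predicted[0]) ** 2 + (candidate[1] - predicted[1]) ** 2
    return math.exp(-d2 / (2.0 * scale ** 2))

predicted = (50.0, 50.0)                                   # position predicted from previous frame
candidates = [(49.0, 52.0), (80.0, 10.0), (51.0, 48.0)]    # detections in the current frame

weights = [likelihood(c, predicted) for c in candidates]
total = sum(weights)
weights = [w / total for w in weights]

# Resample: fake candidates far from the prediction are rarely kept
kept = random.choices(candidates, weights=weights, k=10)
print(kept)
```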

A Design and Implementation of Reliability Analyzer for Embedded Software using Markov Chain Model and Unit Testing (내장형 소프트웨어 마르코프 체인 모델과 단위 테스트를 이용한 내장형 소프트웨어 신뢰도 분석 도구의 설계와 구현)

  • Kwak, Dong-Gyu;Yoo, Chae-Woo;Choi, Jae-Young
    • Journal of the Korea Society of Computer and Information / v.16 no.12 / pp.1-10 / 2011
  • As the requirements of embedded systems become more complex, tools for analyzing the reliability of embedded software are needed. Probabilistic modeling is a common way of analyzing software reliability, but to apply it to embedded software that controls multiple devices it must be specialized for the embedded domain. In addition, existing reliability analyzers measure the transition probability of each state in different ways and do not consider reusing a model once it has been built. In this paper, we propose a reliability analyzer for embedded software that uses an embedded-software Markov chain model together with a unit testing tool. The embedded-software Markov chain model specializes the Markov chain model commonly used for reliability analysis to embedded software, and the unit testing tool has a host-target structure suited to embedded-software development environments. The proposed tool analyzes reliability more easily than existing tools by automatically measuring the transition probabilities between units from the results of unit testing. It can also directly apply test results updated by the unit testing tool, because the software model is represented as an XML document, and it has the advantage that many developers can access it easily through a web-based interface and an SVN repository. We demonstrate reliability analysis on an example to show the usefulness of the analyzer.
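
A small sketch of how a Markov chain usage model yields a reliability figure: given transition probabilities between software units and absorbing success/failure states, the probability of eventually reaching the success state is computed by fixed-point iteration. The states and probabilities are assumed stand-ins for the values the proposed tool would measure automatically from unit-test results.

```python
# Assumed Markov chain usage model: three software units plus absorbing
# "success" and "failure" states, with placeholder transition probabilities.
P = {
    "unit_A": {"unit_B": 0.98, "failure": 0.02},
    "unit_B": {"unit_C": 0.95, "unit_A": 0.02, "failure": 0.03},
    "unit_C": {"success": 0.99, "failure": 0.01},
    "success": {"success": 1.0},
    "failure": {"failure": 1.0},
}

# Probability of eventually reaching "success" from each state, by fixed-point iteration
prob = {s: (1.0 if s == "success" else 0.0) for s in P}
for _ in range(1000):
    for s in ("unit_A", "unit_B", "unit_C"):
        prob[s] = sum(p * prob[t] for t, p in P[s].items())

print(f"Estimated software reliability (starting at unit_A): {prob['unit_A']:.4f}")
```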

Analysis of the Mean and Standard Deviation due to the Change of the Probability Density Function on Tidal Elevation Data (조위의 확률밀도함수 변화에 따른 평균 및 표준편차 분석)

  • Cho, Hong-Yeon;Jeong, Shin-Taek;Lee, Khil-Ha;Kim, Tae-Heon
    • Journal of Korean Society of Coastal and Ocean Engineers / v.22 no.4 / pp.279-285 / 2010
  • In probability-based design of coastal structures, the probability density function (pdf) of tidal elevation data is usually assumed to be the normal distribution. The pdf of tidal elevation data, however, is better fitted by a double-peak normal distribution, so an estimation process for the equivalent mean and standard deviation (SD) based on an equivalent normal distribution is required. These equivalent parameters differ from the mean and SD (normal parameters) estimated under the assumption that the tidal elevation pdf is normal. In this study, the difference, i.e., the estimation error, between the equivalent and normal parameters is compared and analysed. The difference increases as the tidal elevation and its range increase. For tidal elevations of ±400 cm at Incheon station, the mean and SD differences are above 100 cm and about 80~100 cm, respectively, whereas for tidal elevations of ±60 cm at Pohang station they are very small, in the range of 2~4 cm.
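
The sketch below shows the double-peak (two-component normal mixture) idea and how equivalent parameters follow from the mixture moments: the equivalent mean is the weighted mean of the component means, and the equivalent variance is the weighted second moment minus the squared mean. The component weights, means and SDs are illustrative, not the fitted values for Incheon or Pohang.

```python
import math

# Two-component normal mixture standing in for a double-peak tidal elevation pdf
w1, mu1, s1 = 0.5, -250.0, 120.0   # low-water peak (cm), assumed
w2, mu2, s2 = 0.5, 250.0, 120.0    # high-water peak (cm), assumed

mean_eq = w1 * mu1 + w2 * mu2
var_eq = w1 * (s1**2 + mu1**2) + w2 * (s2**2 + mu2**2) - mean_eq**2
sd_eq = math.sqrt(var_eq)

print(f"Equivalent mean = {mean_eq:.1f} cm, equivalent SD = {sd_eq:.1f} cm")
```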

A Three-Dimensional Slope Stability Analysis in Probabilistic Solution (3차원(次元) 사면(斜面) 안정해석(安定解析)에 관한 확률론적(確率論的) 연구(研究))

  • Kim, Young Su
    • KSCE Journal of Civil and Environmental Engineering Research / v.4 no.3 / pp.75-83 / 1984
  • The probability of failure, rather than the conventional factor of safety, is used to analyze the reliability of three-dimensional slope failure. The strength parameters are assumed to follow normal and beta distributions and are interval-estimated at a specified confidence level using maximum likelihood estimation. Pseudo-normal and beta random variables are generated by the uniform probability transformation method (based on the central limit theorem) and the rejection method. By means of Monte Carlo simulation, the probability of failure is defined as $P_f = M/N$, where N is the total number of trials and M is the total number of failures. Some of the conclusions derived from the case study include: 1. Three-dimensional factors of safety are generally much higher than 2-D factors of safety, although situations exist in which the 3-D factor of safety can be lower than the 2-D value. 2. The $F_3/F_2$ ratio appears quite sensitive to c and $\phi$ and to the shape of the 3-D shear surface and the slope, but not to the unit weight of the soil. 3. Of the two models (normal, beta) considered for the distribution of the factor of safety, the beta distribution generally gives larger estimates than the normal distribution. 4. Results obtained using the beta and normal models are presented in a nomograph relating slope height and slope angle to the probability of failure.
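
The Monte Carlo definition $P_f = M/N$ can be illustrated in a few lines: the sketch below samples normally distributed c and φ and counts trials whose factor of safety falls below 1. The simple infinite-slope factor of safety and all parameter values are assumptions used only to show the procedure, not the paper's three-dimensional shear-surface model.

```python
import math
import random

random.seed(42)
gamma, z, beta_deg = 18.0, 5.0, 30.0     # unit weight (kN/m^3), depth (m), slope angle (deg); assumed
beta = math.radians(beta_deg)

N, M = 100_000, 0
for _ in range(N):
    c = random.gauss(12.0, 3.0)                      # cohesion (kPa), assumed mean/std
    phi = math.radians(random.gauss(28.0, 3.0))      # friction angle (deg), assumed mean/std
    # Infinite-slope factor of safety (no pore pressure), illustration only
    fs = (c + gamma * z * math.cos(beta) ** 2 * math.tan(phi)) / (
        gamma * z * math.sin(beta) * math.cos(beta))
    if fs < 1.0:
        M += 1

print(f"Probability of failure Pf = M/N = {M / N:.4f}")
```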


Robust Design Method for Complex Stochastic Inventory Model

  • Hwang, In-Keuk;Park, Dong-Jin
    • Proceedings of the Korean Operations and Management Science Society Conference / 1999.04a / pp.426-426 / 1999
  • There are many sources of uncertainty in a typical production and inventory system: uncertainty about how many items customers will demand during the next day, week, month, or year, and uncertainty about delivery times of the product. Uncertainty exacts a toll from management in a variety of ways; a spurt in demand or a delay in production may lead to stockouts, with the potential for lost revenue and customer dissatisfaction. Firms typically hold inventory as protection against uncertainty, since a cushion of inventory on hand allows management to face unexpected demand or delivery delays with a reduced chance of incurring a stockout. The proposed strategies are used for the design of a probabilistic inventory system. In the traditional approach to inventory system design, the goal is to find the setting of the inventory control policy parameters, such as the reorder level, review period and order quantity, that minimizes the total inventory cost. Here the goals of the analysis are defined so that robustness becomes an important design criterion, and appropriate noise variables are conceptualized and identified. There are two main goals for the inventory policy design: to minimize the average inventory cost and the stockouts, and to minimize the variability of the average inventory cost and the stockouts. The total average inventory cost is the sum of three components: the ordering cost, the holding cost, and the shortage cost, where the shortage cost includes the cost of lost sales, loss of goodwill, customer dissatisfaction, and so on. The noise factors for this design problem are the mean demand rate and the mean lead time, both assumed to be normal random variables. Robustness for this inventory system is therefore interpreted as insensitivity of the average inventory cost and the stockouts to uncontrollable fluctuations in the mean demand rate and mean lead time. To achieve robustness, the concept of utility theory is used. Utility theory is an analytical method for deciding which action to take given multiple decision criteria; it is appropriate for a design involving attributes with different scales, such as demand rate and lead time, because it maps the attributes onto a common zero-to-one scale, with higher preference modeled by a higher rank. Using utility theory, three design strategies for the robust inventory system are developed: a distance strategy, a response strategy, and a priority-based strategy.
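
As a concrete illustration of the setting described here, the sketch below simulates a continuous-review (r, Q) inventory policy under normally distributed demand and lead time and scores it across a grid of noise settings, so both the mean cost and its variability (the robustness criterion) can be inspected. All costs, parameters and noise levels are assumptions; the utility-based distance, response and priority strategies themselves are not implemented.

```python
import random
import statistics

def simulate(r, Q, mean_demand, mean_lead, days=2000, seed=0):
    """Simulate one (reorder point r, order quantity Q) policy; assumed costs/parameters."""
    rng = random.Random(seed)
    on_hand, pipeline = 200.0, []            # pipeline holds (arrival_day, quantity) pairs
    holding, ordering, stockouts = 0.0, 0.0, 0
    for day in range(days):
        on_hand += sum(q for d, q in pipeline if d <= day)     # receive orders due today
        pipeline = [(d, q) for d, q in pipeline if d > day]
        demand = max(0.0, rng.gauss(mean_demand, 0.2 * mean_demand))   # noise: demand
        if demand > on_hand:
            stockouts += 1
            on_hand = 0.0
        else:
            on_hand -= demand
        holding += 0.1 * on_hand                               # assumed holding cost per unit-day
        position = on_hand + sum(q for _, q in pipeline)
        if position <= r:                                      # continuous-review reorder rule
            lead = max(1, round(rng.gauss(mean_lead, 0.3 * mean_lead)))  # noise: lead time
            pipeline.append((day + lead, Q))
            ordering += 50.0                                   # assumed fixed ordering cost
    return (holding + ordering) / days, stockouts

# Robustness check: evaluate the same policy over a grid of noise settings
results = [simulate(r=120, Q=300, mean_demand=m, mean_lead=l, seed=s)
           for m in (18, 20, 22) for l in (4, 6, 8) for s in (1, 2)]
costs = [c for c, _ in results]
print(f"mean daily cost = {statistics.mean(costs):.1f}, "
      f"std over noise settings = {statistics.stdev(costs):.1f}")
```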
