• Title/Summary/Keyword: probability theory

Reliability-based fragility analysis of nonlinear structures under the actions of random earthquake loads

  • Salimi, Mohammad-Rashid;Yazdani, Azad
    • Structural Engineering and Mechanics
    • /
    • v.66 no.1
    • /
    • pp.75-84
    • /
    • 2018
  • This study presents a reliability-based analysis of nonlinear structures, using analytical fragility curves, under random earthquake loads. A stochastic method of ground-motion simulation is combined with random vibration theory to compute the structural failure probability. Formulating the failure probability through random vibration theory, which requires only the frequency content of the excitation, provides an important basis for structural analysis in regions lacking sufficient recorded ground motions. The influence of the frequency content of ground motions on the probability of structural failure is studied for different levels of nonlinear structural behavior. The set of simulated ground motions used in this study is based on the results of probabilistic seismic hazard analysis. It is demonstrated that the scenario events identified by the seismic risk differ from those obtained by disaggregation of the seismic hazard. The validity of the presented procedure is evaluated by Monte-Carlo simulation; a minimal sketch of such a check follows below.
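
The Monte-Carlo check mentioned in the abstract can be made concrete with a minimal sketch: a nonlinear (Duffing-type) oscillator is driven by simulated random excitations and failures are counted as drift-capacity exceedances. All parameters below (oscillator properties, noise intensity, capacity) are illustrative assumptions, not values from the paper.

```python
# Minimal Monte-Carlo estimate of failure probability for a nonlinear SDOF
# under random excitation. All numbers are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

def peak_drift(accel, dt, wn=2*np.pi, zeta=0.05, eps=0.1):
    """Semi-implicit Euler for a Duffing SDOF:
    x'' + 2*zeta*wn*x' + wn^2*(x + eps*x^3) = -a_g(t)."""
    x = v = peak = 0.0
    for a in accel:
        acc = -a - 2*zeta*wn*v - wn**2 * (x + eps*x**3)
        v += acc*dt
        x += v*dt
        peak = max(peak, abs(x))
    return peak

dt, T = 0.01, 10.0
sigma = 1.0        # excitation intensity (assumed)
capacity = 0.5     # drift capacity defining "failure" (assumed)
n_sim = 500

failures = 0
for _ in range(n_sim):
    a_g = rng.normal(0.0, sigma/np.sqrt(dt), int(T/dt))  # discretized white noise
    failures += peak_drift(a_g, dt) > capacity

print(f"estimated failure probability: {failures/n_sim:.3f}")
```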

Teleportation into Quantum Statistics

  • Gill, Richard
    • Journal of the Korean Statistical Society
    • /
    • v.30 no.2
    • /
    • pp.291-325
    • /
    • 2001
  • The paper is a tutorial introduction to quantum information theory, developing the basic model and emphasizing the role of statistics and probability; a toy simulation of the teleportation protocol named in the title is sketched below.
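
As a companion to the title, here is a toy linear-algebra simulation of the quantum teleportation protocol, showing where Born-rule probabilities enter the basic model; it is a pedagogical sketch, not code from the paper.

```python
# Toy teleportation: Alice Bell-measures her two qubits; each outcome has
# probability 1/4 (Born rule); Bob's Pauli correction recovers |psi>.
import numpy as np

rng = np.random.default_rng(1)

# Random qubit state to teleport: |psi> = a|0> + b|1>.
a, b = rng.normal(size=2) + 1j*rng.normal(size=2)
psi = np.array([a, b]); psi /= np.linalg.norm(psi)

bell = np.array([1, 0, 0, 1]) / np.sqrt(2)   # shared pair |Phi+>
state = np.kron(psi, bell)                   # qubits 0,1 with Alice; qubit 2 with Bob

I = np.eye(2); X = np.array([[0, 1], [1, 0]]); Z = np.array([[1, 0], [0, -1]])
bell_basis = [np.array([1, 0, 0,  1])/np.sqrt(2),   # Phi+
              np.array([1, 0, 0, -1])/np.sqrt(2),   # Phi-
              np.array([0, 1, 1,  0])/np.sqrt(2),   # Psi+
              np.array([0, 1, -1, 0])/np.sqrt(2)]   # Psi-
corrections = [I, Z, X, Z @ X]

state3 = state.reshape(4, 2)                      # rows: Alice's qubits, cols: Bob's
amps = [bv.conj() @ state3 for bv in bell_basis]  # Bob's unnormalized state per outcome
probs = [np.vdot(v, v).real for v in amps]        # Born rule: ~[0.25, 0.25, 0.25, 0.25]
print("Bell outcome probabilities:", np.round(probs, 3))

k = rng.choice(4, p=probs)                        # Alice's measurement result
bob = corrections[k] @ (amps[k] / np.sqrt(probs[k]))
print("Bob recovered |psi>:", np.allclose(abs(np.vdot(bob, psi)), 1.0))
```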

Prospects for the Outbreak of War between the U.S. and China: A Comparison of the U.S.-Japan Relationship in the World War II Era and the Modern U.S.-China Relationship (태평양 전쟁 전 미일관계와 현재의 미중관계 비교를 통한 미중간 전쟁 발발 가능성에 관한 연구)

  • Kim, Tae-sung
    • Strategy21
    • /
    • s.40
    • /
    • pp.37-81
    • /
    • 2016
  • This paper uses crossover analysis to uncover similarities and differences between the U.S.-Japan relationship in the World War II era and the modern U.S.-China relationship, and forecasts the possibility of war between the U.S. and China by applying the steps-to-war theory. The theory holds that the probability of war between two states within five years approaches 90 percent if they have an ongoing territorial dispute, alliances, rivalry, and an arms race. The comparison exposes similarities in territorial dispute, alliance, and rivalry, but a dissimilarity in the arms race: the World War II-era U.S.-Japan relationship involved an arms race, which is absent from the modern U.S.-China relationship. The comparison places the U.S.-China case at the third stage (risk level), corresponding to a war probability of 55 percent. However, four elements (① leaders' perceptions, ② economic interdependence, ③ possession of nuclear weapons, ④ the ravages of war) reduce this probability; considering them, the chance of war between the U.S. and China is slim. Nevertheless, the possibility cannot be ruled out, given the territorial dispute, alliance, and rivalry. The paper therefore suggests three measures to prevent the outbreak of war: ① developing military options, ② reducing misperceptions of intent, and ③ promoting naval exchanges.

Probabilistic Applications for Estimating and Managing Project Contingency (확률이론을 이용한 프로젝트 예비비 산정 및 관리)

  • Lee Man-Hee;Yoo Wi-Sung;Lee Hak-ki
    • Proceedings of the Korean Institute Of Construction Engineering and Management
    • /
    • 2004.11a
    • /
    • pp.224-227
    • /
    • 2004
  • As a project progresses, the construction manager must define a contingency for the expected project cost, which serves as a buffer against uncertainty. In this study, uncertainty refers to the degree to which project cost is difficult or impossible to predict. Completed work packages yield true cost values, and this information is good data for estimating a realistic contingency for the work packages still to be accomplished. Based on this historical information, the construction manager recomputes the contingency for the remaining work. Conditional probability theory is useful for this re-estimation, since the true cost of the completed work can differ from the planned cost. As the project progresses, the true values become essential for predicting a realistic project budget and reducing uncertainty. In this study, we apply conditional probability theory to estimating project contingency, supposing a project that consists of five work packages, and provide a fundamental framework for setting and controlling project contingency; a sketch of the update is given below.
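
A minimal sketch of the kind of conditional (Bayesian) update described above, assuming a conjugate normal model for the cost-overrun ratio; the planned costs, prior, observation noise, and observed actuals are invented for illustration.

```python
# Conditional re-estimation of contingency: conjugate normal update of the
# cost-overrun ratio r = actual/planned as work packages complete.
import numpy as np
from statistics import NormalDist

planned = np.array([100., 80., 120., 60., 90.])   # five work packages (assumed)
actual_done = np.array([108., 86.])               # completed packages so far (assumed)

mu0, tau0 = 1.00, 0.10    # prior: r ~ N(mu0, tau0^2)
sigma = 0.05              # per-package observation noise (assumed)

# Posterior of r given the completed packages (normal-normal conjugacy).
obs = actual_done / planned[:len(actual_done)]
n = len(obs)
post_var = 1.0 / (1.0/tau0**2 + n/sigma**2)
post_mu = post_var * (mu0/tau0**2 + obs.sum()/sigma**2)

# Contingency = 90th-percentile predictive overrun applied to remaining baseline.
remaining = planned[len(actual_done):].sum()
r90 = NormalDist(post_mu, np.sqrt(post_var + sigma**2)).inv_cdf(0.90)
print(f"posterior overrun ratio: {post_mu:.3f} +/- {np.sqrt(post_var):.3f}")
print(f"recommended contingency: {remaining*(r90 - 1):.1f} on baseline {remaining:.0f}")
```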

The Reliability Estimation of Buried Pipeline Using the FAD and FORM (파손평가선도(FAD)와 FORM을 이용한 매설배관의 건전성 평가)

  • Lee, Ouk-Sub;Kim, Dong-Hyeok
    • Journal of the Korean Society of Safety
    • /
    • v.20 no.4 s.72
    • /
    • pp.20-28
    • /
    • 2005
  • In this paper, a methodology for the reliability estimation of buried pipelines with longitudinal gouges and dents is presented, and the limit state of the buried pipeline is formulated with a failure assessment diagram (FAD). The reliability of buried pipelines with defects is estimated using failure probability theory, with the failure probability calculated by the first-order reliability method (FORM) and by Monte-Carlo simulation, and the results of the two procedures compared; a minimal version of this comparison is sketched below. The FORM and Monte-Carlo simulation give similar results over the boundary conditions and random variables considered. The failure probability increases with increasing dent depth, gouge depth, gouge length, operating pressure, and pipe outside radius, and with decreasing wall thickness. The analysis using the failure assessment diagram gives considerably more conservative results than failure probability theory.
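
A minimal sketch of the FORM versus Monte-Carlo comparison for a simple limit state g = R - S with normal resistance and load; the values are illustrative and the limit state is not the paper's FAD formulation.

```python
# FORM vs. Monte-Carlo failure probability for g = R - S (resistance minus
# load) with independent normal variables; all values are assumptions.
import numpy as np
from statistics import NormalDist

mu_R, sd_R = 30.0, 3.0    # resistance (assumed)
mu_S, sd_S = 20.0, 4.0    # load effect (assumed)

# FORM: for a linear limit state in normal variables the Hasofer-Lind
# reliability index is exact: beta = (mu_R - mu_S)/sqrt(sd_R^2 + sd_S^2).
beta = (mu_R - mu_S) / np.hypot(sd_R, sd_S)
pf_form = NormalDist().cdf(-beta)

# Monte-Carlo check.
rng = np.random.default_rng(42)
n = 1_000_000
R = rng.normal(mu_R, sd_R, n)
S = rng.normal(mu_S, sd_S, n)
pf_mc = np.mean(R - S < 0.0)

print(f"beta = {beta:.3f}, Pf(FORM) = {pf_form:.4e}, Pf(MC) = {pf_mc:.4e}")
```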

Estimation of Mean Life and Reliability of Highway Pavement Based on Reliability Theory (신뢰성 개념을 이용한 포장의 평균수명 및 신뢰도 예측)

  • Do, Myung-Sik
    • KSCE Journal of Civil and Environmental Engineering Research
    • /
    • v.30 no.5D
    • /
    • pp.497-504
    • /
    • 2010
  • In this paper, the author presents a reliability-based technique for analyzing the effect of traffic loads on pavement mean life, using the national highway database for the Suwon and Uijeongbu regions from 1999 to 2008. The mean life, its standard deviation, and the reliability of pavement sections are calculated under a suitable lifetime model, the lognormal distribution, based on reliability theory; both the probability-paper method and maximum likelihood estimation are used to estimate its parameters, as sketched below. The mean life is found to be about 6.5 to 7.9 years for newly constructed sections and 7.3 to 9.1 years for overlaid sections. The cumulative failure probabilities of pavement life predicted by the proposed methods agree closely with the observed data. Such an assessment methodology based on reliability theory can provide useful information for maintenance planning in pavement management systems as additional life data on pavement sections accumulate.
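
A small sketch of the lognormal life analysis: fit by maximum likelihood (equivalently, a normal fit to log-lifetimes), then report mean life and reliability R(t). The failure ages below are invented, and censoring is ignored.

```python
# Lognormal lifetime fit by MLE, with mean life and reliability R(t).
# The failure-age data are illustrative, not from the paper.
import numpy as np
from statistics import NormalDist

lives = np.array([5.1, 6.4, 7.0, 7.8, 8.2, 6.9, 9.0, 7.5])  # years (assumed)

# MLE for a lognormal = normal fit to log-lifetimes (1/n variance estimator).
logs = np.log(lives)
mu_hat = logs.mean()
sigma_hat = logs.std(ddof=0)

mean_life = np.exp(mu_hat + sigma_hat**2 / 2)            # lognormal mean
std_life = mean_life * np.sqrt(np.exp(sigma_hat**2) - 1)  # lognormal std

def reliability(t):
    """R(t) = P(life > t) = 1 - Phi((ln t - mu)/sigma)."""
    return 1.0 - NormalDist(mu_hat, sigma_hat).cdf(np.log(t))

print(f"mean life = {mean_life:.2f} yr, std = {std_life:.2f} yr")
print(f"R(7 years) = {reliability(7.0):.3f}")
```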

Rule of Combination Using Expanded Approximation Algorithm (확장된 근사 알고리즘을 이용한 조합 방법)

  • Moon, Won Sik
    • Journal of Korea Society of Digital Industry and Information Management
    • /
    • v.9 no.3
    • /
    • pp.21-30
    • /
    • 2013
  • Dempster-Shafer theory is a good method for representing and handling imprecise information, but its rule of combination has a limitation: the computational complexity grows exponentially, making it too slow to apply in real situations. Several attempts have been made to reduce this complexity, but with the side effect of lowering certainty. This study proposes an expanded approximation algorithm that considers both the smallest supersets and the largest subsets, expanding the basic frame into a space that includes complements in order to reduce the approximation error. Using the proposed algorithm, the basic probability assignment (BPA) of a subset is allotted and added to the BPAs of the sets related to it, so that newly created subsets are approximated more efficiently. As a result, the certainty values based on the basic probability assignment come close to the actual optimal result, and correctness is retained while the computational complexity is reduced, so the exact information a system needs can still be obtained. The exact rule that the approximation targets is sketched below.
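
For reference, a minimal exact implementation of Dempster's rule of combination over a three-element frame; its cost grows with the number of focal sets (up to 2^|frame|), which is what the approximation algorithm above targets. The masses are illustrative.

```python
# Exact Dempster rule of combination for two basic probability assignments
# (BPAs) over a small frame of discernment; masses are illustrative.
from itertools import product

def combine(m1, m2):
    """Dempster's rule: m(C) is proportional to the sum of m1(A)*m2(B)
    over all pairs with A & B == C, C nonempty; conflict is normalized out."""
    raw, conflict = {}, 0.0
    for (A, a), (B, b) in product(m1.items(), m2.items()):
        C = A & B
        if C:
            raw[C] = raw.get(C, 0.0) + a * b
        else:
            conflict += a * b
    k = 1.0 - conflict                 # normalization constant
    return {C: v / k for C, v in raw.items()}

# Frame {x, y, z}; focal sets represented as frozensets.
m1 = {frozenset("x"): 0.6, frozenset("xy"): 0.3, frozenset("xyz"): 0.1}
m2 = {frozenset("y"): 0.5, frozenset("xz"): 0.3, frozenset("xyz"): 0.2}

for focal, mass in sorted(combine(m1, m2).items(), key=lambda kv: -kv[1]):
    print(set(focal), round(mass, 4))
```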

Analysis of the margin level in the KOSPI200 futures market (KOSPI200 선물 시장의 증거금 수준에 대한 연구)

  • Kim, Jun;Choe, In-Chan
    • Proceedings of the Korean Operations and Management Science Society Conference
    • /
    • 2004.05a
    • /
    • pp.734-737
    • /
    • 2004
  • When the margin level is set relatively low, the margin violation probability increases and the default probability of the futures market rises. On the other hand, if the margin level is set high, the margin violation probability decreases, but the futures market becomes less attractive to hedgers because investors' opportunity cost increases. In this paper, we investigate whether the daily price movements of KOSPI200 (Korea Composite Stock Price Index 200) futures can be modeled with extreme value theory, and on that basis examine the validity of a margin level set by extreme value theory; a sketch of the approach follows below. Computational results compare the extreme value distribution with the empirical distribution of margin violations in KOSPI200, and some observations and implications drawn from the experiments are discussed.
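
A hedged sketch of the margin-setting idea: fit a Gumbel (extreme value) distribution to block maxima of absolute daily price changes and take a high quantile as the margin level. The return series is simulated, not KOSPI200 data, and the moment-based Gumbel fit is one simple choice among several.

```python
# Margin level from extreme value theory: Gumbel fit to block maxima of
# absolute daily returns. Data and parameters are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(7)
returns = rng.standard_t(df=4, size=2500) * 0.01   # heavy-tailed daily returns (assumed)

# Block maxima: largest absolute move in each 25-day block.
maxima = np.abs(returns).reshape(-1, 25).max(axis=1)

# Gumbel fit by the method of moments: beta = s*sqrt(6)/pi, mu = m - gamma*beta.
gamma = 0.5772156649                                # Euler-Mascheroni constant
beta = maxima.std(ddof=1) * np.sqrt(6) / np.pi
mu = maxima.mean() - gamma * beta

def gumbel_quantile(p):
    """Inverse CDF of Gumbel(mu, beta): mu - beta*ln(-ln p)."""
    return mu - beta * np.log(-np.log(p))

margin = gumbel_quantile(0.99)      # margin covering 99% of block maxima
violation = np.mean(np.abs(returns) > margin)
print(f"margin level = {margin:.4f}, empirical daily violation rate = {violation:.4%}")
```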

Periodization in the History of Statistics

  • Jo, Jae-Keun
    • Communications for Statistical Applications and Methods
    • /
    • v.11 no.1
    • /
    • pp.31-47
    • /
    • 2004
  • The history of statistics from the mid-seventeenth to the early twentieth century is considered and a scheme of periodization is proposed. In the first period (1650-1750), named 'the age of probability' in this paper, the concept of probability emerged. In the second period (1750-1820), named 'the age of error theory', statistical techniques such as the method of least squares were developed by astronomers and geodesists and given theoretical support by mathematicians such as Laplace and Gauss. The third period (1820-1880) is called 'the age of statistics (as a plural noun)', since statistical data played prominent roles in social sciences such as sociology and psychology. In the last period (1880- ), called 'the age of statistics (as a singular noun)', the discipline of statistics came to maturity in both theory and application.