• Title/Summary/Keyword: probability theory

Search Result 689

The Frequency Theory of Probability and Popper (확률의 상관 빈도이론과 포퍼)

  • Song, Ha-Seok
    • Korean Journal of Logic
    • /
    • v.8 no.1
    • /
    • pp.23-46
    • /
    • 2005
  • The purpose of the paper is to discuss and evaluate early Popper's theory of probability, which is presented in his book, The Logic of Scientific Discovery. For this, Von Mises' frequency theory is discussed in detail, as it is regarded as the most systematic and sophisticated frequency theory among others. Von Mises developed his theory in response to various critical questions, such as how finite and empirical collectives can be represented in terms of infinite and mathematical collectives, and how the axiom of randomness can be mathematically formulated. But his theory has another difficulty, which concerns the inconsistency between the axiom of convergence and the axiom of randomness. Defending the objective theory of probability, Popper tries to present his own frequency theory, which solves this difficulty. He suggests that the axiom of convergence be given up and that the axiom of randomness be modified to solve Von Mises' problem. That is, Popper introduces the notions of ordinal selection and neighborhood selection to modify the axiom of randomness. He then shows that Bernoulli's theorem can be derived from the modified axiom. Consequently, it can be said that Popper solves the inconsistency problem that is regarded as crucial to Von Mises' theory. However, Popper's suggestion has not drawn much attention, I think because his theory seems counter-intuitive in that it gives up the axiom of convergence, which is the basis of the frequency theory. So, for a more persuasive frequency theory, it is necessary to formulate the axiom of randomness so that it is consistent with the axiom of convergence.

  • PDF
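As a generic illustration of the frequency-theoretic picture the abstract describes (a sketch of the behavior the axiom of convergence postulates, not Popper's or Von Mises' formal construction), the relative frequency of successes in simulated Bernoulli trials settles near the underlying probability:

```python
import random

random.seed(42)

def relative_frequency(p: float, n: int) -> float:
    """Relative frequency of success in n Bernoulli(p) trials."""
    successes = sum(1 for _ in range(n) if random.random() < p)
    return successes / n

# As n grows, the relative frequency clusters around p = 0.5,
# which is what the axiom of convergence asserts for a collective.
for n in (100, 10_000, 1_000_000):
    print(n, round(relative_frequency(0.5, n), 4))
```

Bernoulli's theorem makes this precise: the probability that the relative frequency deviates from p by more than any fixed margin tends to zero as n grows.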

Using Simulation for a Didactic Transposition of Probability (시뮬레이션을 활용한 확률 지식의 교수학적 변환)

  • Shin, Bo-Mi;Lee, Kyung-Hwa
    • Journal of Educational Research in Mathematics
    • /
    • v.18 no.1
    • /
    • pp.25-50
    • /
    • 2008
  • Several previous studies suggested that simulation could be a main didactic instrument for overcoming misconceptions and for probability modeling. However, they have not sufficiently described how to reorganize probability knowledge as knowledge to be taught in a curriculum using simulation. The purpose of this study is to identify the theoretical knowledge needed to develop a didactic transposition method for probability knowledge using simulation. The theoretical knowledge needed to develop this method was specified as follows: pseudo-contextualization/pseudo-personalization and pseudo-decontextualization/pseudo-depersonalization, according to the introductory purposes of simulation. As a result, this study developed a local instruction theory and a hypothetical learning trajectory for overcoming misconceptions and for modeling situations, respectively. This study organized the educational intention, which was designed to didactically transform probability knowledge according to the introductory purposes of simulation, into a curriculum, lesson plans, and experimental teaching materials, to present didactic ideas for new probability education programs in the high school probability curriculum.

  • PDF
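A minimal sketch of the kind of classroom simulation the abstract has in mind; the birthday problem used here is our own illustrative example, not one taken from the paper:

```python
import random

random.seed(1)

def birthday_match_prob(group_size: int, trials: int = 20_000) -> float:
    """Estimate by simulation the probability that at least two people
    in a group share a birthday (365 equally likely days assumed)."""
    hits = 0
    for _ in range(trials):
        days = [random.randrange(365) for _ in range(group_size)]
        if len(set(days)) < group_size:   # a repeated day means a match
            hits += 1
    return hits / trials

# Many students guess this probability is far below one half for 23
# people; running the simulation confronts the misconception directly.
print(round(birthday_match_prob(23), 3))
```

For 23 people the exact value is about 0.507, so a few thousand simulated trials already contradict the typical intuition.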

Failure Probability Evaluation of Pressure Tube using the Probabilistic Fracture Mechanics (확률론적 파괴역학 기법을 이용한 압력관의 파손확률 평가)

  • Son, Jong-Dong;Oh, Dong-Joon
    • Journal of the Korean Society of Safety
    • /
    • v.22 no.4
    • /
    • pp.7-12
    • /
    • 2007
  • In order to evaluate the integrity of Zr-2.5Nb pressure tubes, a probabilistic fracture mechanics (PFM) approach was employed. A failure assessment diagram (FAD), plastic collapse, and critical crack lengths (CCL) were used as failure criteria for evaluating the failure probability. The Kr-FAD was used as the failure assessment diagram because fracture of pressure tubes occurs in a brittle manner due to hydrogen embrittlement of the material by deuterium fluence. The probabilistic integrity evaluation followed AECL procedures and used fracture toughness parameters from EPRI and recently announced theory. In conclusion, the probabilistic approach using the Kr-FAD made it possible to determine the major failure criterion in the pressure tube integrity evaluation.
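The core of any PFM failure-probability estimate is a Monte Carlo loop over uncertain inputs. The sketch below shows only that generic idea; the lognormal distributions and their parameters are illustrative stand-ins, not the AECL procedures or EPRI toughness data the paper uses:

```python
import random

random.seed(7)

def pfm_failure_probability(n: int = 100_000) -> float:
    """Toy probabilistic-fracture-mechanics estimate: a 'failure' is
    counted when a sampled crack length reaches a sampled critical
    crack length.  All distributions here are hypothetical."""
    failures = 0
    for _ in range(n):
        crack = random.lognormvariate(0.0, 0.5)     # mm, hypothetical flaw size
        critical = random.lognormvariate(1.5, 0.3)  # mm, hypothetical CCL
        if crack >= critical:
            failures += 1
    return failures / n

print(pfm_failure_probability())
```

A real evaluation would replace the two samplers with the assessed flaw-size distribution and a CCL derived from the Kr-FAD failure criterion.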

Comparison of Some Nonparametric Statistical Inference for Logit Model (로짓모형의 비모수적 추론의 비교)

  • 정형철;김대학
    • The Korean Journal of Applied Statistics
    • /
    • v.15 no.2
    • /
    • pp.355-366
    • /
    • 2002
  • Nonparametric statistical inference for the parameters of the logit model was examined. Usually a nonparametric approach is milder than a parametric approach based on the normal theory assumption. We compared two nonparametric methods for the logit model, the bootstrap and random permutation, in terms of coverage probability. A Monte Carlo simulation was conducted for small-sample cases. The empirical power of the hypothesis test and the coverage probability of the confidence interval estimation are presented for the simple and multiple logit models, respectively. An example is also introduced.
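A minimal sketch of the bootstrap side of the comparison, deliberately reduced to a percentile interval for a single log-odds (logit) parameter rather than a full logit regression; the function and the small continuity correction are illustrative choices, not the paper's method:

```python
import math
import random

random.seed(3)

def bootstrap_logit_ci(data, n_boot: int = 1000, alpha: float = 0.05):
    """Percentile bootstrap CI for the log-odds log(p/(1-p)) of a 0/1
    sample -- a simplified stand-in for a logit-model coefficient."""
    n = len(data)
    stats = []
    for _ in range(n_boot):
        resample = [random.choice(data) for _ in range(n)]
        p = (sum(resample) + 0.5) / (n + 1.0)   # continuity correction
        stats.append(math.log(p / (1.0 - p)))
    stats.sort()
    lo = stats[int(n_boot * alpha / 2)]
    hi = stats[int(n_boot * (1.0 - alpha / 2))]
    return lo, hi

# Coverage question: does the interval contain the true log-odds?
data = [1] * 30 + [0] * 70                      # sample proportion 0.3
lo, hi = bootstrap_logit_ci(data)
print(lo <= math.log(0.3 / 0.7) <= hi)
```

Repeating this over many simulated samples and counting how often the interval covers the true value gives the coverage probability the paper compares against the permutation approach.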

A MULTIVARIATE JUMP DIFFUSION PROCESS FOR COUNTERPARTY RISK IN CDS RATES

  • Ramli, Siti Norafidah Mohd;Jang, Jiwook
    • Journal of the Korean Society for Industrial and Applied Mathematics
    • /
    • v.19 no.1
    • /
    • pp.23-45
    • /
    • 2015
  • We consider counterparty risk in CDS rates. To do so, we use a multivariate jump diffusion process for obligors' default intensity, where jumps (i.e. the magnitude of the contribution of primary events to default intensities) occur simultaneously and their sizes are dependent. For the arrival of these simultaneous jumps, a homogeneous Poisson process is used. We apply copula-dependent default intensities of a multivariate Cox process to derive the joint Laplace transform, which provides us with the joint survival/default probability and other relevant joint probabilities. For that purpose, the piecewise deterministic Markov process (PDMP) theory developed in [7] and the martingale methodology in [6] are used. We compute the survival/default probability using three copulas, namely the Farlie-Gumbel-Morgenstern (FGM), Gaussian, and Student-t copulas, with exponential marginal distributions. We then apply the results to calculate CDS rates assuming a deterministic rate of interest and recovery rate. We also conduct a sensitivity analysis of the CDS rates by changing the relevant parameters and provide the corresponding figures.
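To make the copula idea concrete, here is a joint survival probability under the FGM copula with exponential marginals, one of the three combinations named in the abstract. This is the textbook FGM formula, not the paper's full Cox-process/Laplace-transform machinery, and the parameter values below are hypothetical:

```python
import math

def fgm_joint_survival(x: float, y: float,
                       lam1: float, lam2: float, theta: float) -> float:
    """Joint survival P(T1 > x, T2 > y) when the survival copula is
    Farlie-Gumbel-Morgenstern with parameter theta (|theta| <= 1)
    and the marginals are Exponential(lam1) and Exponential(lam2):
    S(x, y) = S1*S2 * (1 + theta*(1 - S1)*(1 - S2))."""
    s1 = math.exp(-lam1 * x)
    s2 = math.exp(-lam2 * y)
    return s1 * s2 * (1.0 + theta * (1.0 - s1) * (1.0 - s2))

# theta = 0 recovers independence; positive theta (positive dependence
# between default times) raises the joint survival probability.
indep = fgm_joint_survival(1.0, 1.0, 0.5, 0.5, 0.0)
dep = fgm_joint_survival(1.0, 1.0, 0.5, 0.5, 0.8)
print(round(indep, 4), round(dep, 4))
```

The FGM family only reaches mild dependence, which is why the paper also considers Gaussian and Student-t copulas for stronger tail dependence.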

A Study on The Factors which Influence on Evaluating Service Life for Carbonation of RC Structures (철근콘크리트 구조물의 탄산화 내구수명 산정에 미치는 영향요인에 관한 문헌적 연구)

  • Yang, Jae-Won;Yoon, Sun-Young;Cho, Hyung-Kyu;Song, Hun;Lee, Han-Seung
    • KIEAE Journal
    • /
    • v.10 no.3
    • /
    • pp.103-110
    • /
    • 2010
  • Carbonation is one of the major deterioration factors for concrete. Therefore, many researchers have proposed equations for determining carbonation depth and the initiation time of steel corrosion due to carbonation, in order to predict the service life of concrete structures. However, there are large gaps among the equations for predicting carbonation, because each researcher considers different factors when predicting carbonation depth. In this study, we therefore calculated the deviations among the proposed carbonation equations and the differing corrosion initiation times they yield, which show considerable scatter. We then evaluated the probability of steel corrosion, taking each deviation into account, using Monte Carlo simulation (MCS), an analysis method based on probability theory. As a result, we provide improved information for determining the service life of reinforced concrete structures subject to carbonation.
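The MCS step can be sketched as follows, using the common square-root-of-time carbonation model (depth = A·√t) and treating the carbonation coefficient and the concrete cover as random variables; the normal distributions and their parameters here are illustrative, not values from the surveyed equations:

```python
import math
import random

random.seed(11)

def corrosion_probability(t_years: float, n: int = 50_000) -> float:
    """Monte Carlo estimate of P(carbonation depth >= cover depth) at
    time t, with depth modelled as A*sqrt(t).  The distributions of
    the carbonation coefficient A and the cover are hypothetical."""
    hits = 0
    for _ in range(n):
        A = random.gauss(3.0, 0.6)        # mm/year^0.5, hypothetical
        cover = random.gauss(30.0, 5.0)   # mm, hypothetical
        if A * math.sqrt(t_years) >= cover:
            hits += 1
    return hits / n

# The corrosion-initiation probability grows with exposure time; the
# service life is the time at which it crosses an acceptable limit.
print(round(corrosion_probability(20), 3), round(corrosion_probability(50), 3))
```

Feeding each researcher's carbonation equation (with its own deviation) through the same loop is what lets the study compare the resulting corrosion probabilities on a common footing.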

A Node Scheduling Algorithm in Duty-Cycled Wireless Sensor Networks

  • Thi, Nga Dao;Dasgupta, Rumpa;Yoon, Seokhoon
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference
    • /
    • 2015.05a
    • /
    • pp.593-594
    • /
    • 2015
  • In wireless sensor networks (WSNs), due to the very low data rate, sleep scheduling is usually used to save energy and prolong the lifetime of nodes. However, a duty-cycled approach can cause a high end-to-end (E2E) delay. In this paper, we study a node scheduling algorithm for WSNs such that the E2E delay meets a delay bound with a given probability. We apply probability theory to identify the relationship between the E2E delay and the node interval. Simulation results illustrate that the network can be configured to achieve a given delay with the required probability, as well as high energy efficiency.

  • PDF
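The delay/interval relationship can be sketched with a simple model in which each hop adds a waiting time uniform on the node's wake-up interval; this simplifying assumption and all parameter values are ours, not the paper's derivation:

```python
import random

random.seed(5)

def p_delay_met(hops: int, interval: float, bound: float,
                trials: int = 20_000) -> float:
    """Estimate the probability that the end-to-end delay over a
    duty-cycled path stays within 'bound', modelling each per-hop
    waiting time as Uniform(0, interval)."""
    met = 0
    for _ in range(trials):
        delay = sum(random.uniform(0.0, interval) for _ in range(hops))
        if delay <= bound:
            met += 1
    return met / trials

# Shortening the wake-up interval raises the probability of meeting
# the delay bound, at the cost of more energy spent on idle listening.
print(p_delay_met(10, 1.0, 6.0), p_delay_met(10, 0.5, 6.0))
```

Inverting this relationship (choosing the largest interval that still meets the bound with the target probability) is the scheduling trade-off the abstract describes.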

A Study on the Design of Safety Work and the Measure of Safety for Accident Prevention (재해 예방을 위한 안전작업의 설계 및 안전도 측정에 관한 연구)

  • 이근희;김도희
    • Journal of Korean Society of Industrial and Systems Engineering
    • /
    • v.17 no.31
    • /
    • pp.177-186
    • /
    • 1994
  • Most causes of accidents are physically unsafe conditions and human unsafe actions. The design of safety work by ergonomic methods is one way to effectively reduce these unsafe conditions and unsafe actions. This paper presents considerations for the design of safety work. When we try to analyze an accident event by means of probability, problems arise because of the fuzziness in the components of physically unsafe conditions and human unsafe actions, which are the causes of the basic events. For this reason, the input probability of a basic event cannot be defined as a crisp value. In consideration of the uncertain probabilities of the components, this paper applies fuzzy set theory with membership values and suggests a calculation procedure and an analysis of the disaster event.

  • PDF
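One common way to realize this idea, sketched here with hypothetical numbers rather than the paper's data, is to represent each basic-event probability as a triangular fuzzy number and propagate alpha-cut intervals through the fault-tree gates:

```python
def tri_alpha_cut(a: float, m: float, b: float, alpha: float):
    """Alpha-cut interval of a triangular fuzzy number (a, m, b):
    at alpha = 0 the full support [a, b], at alpha = 1 the peak m."""
    return (a + alpha * (m - a), b - alpha * (b - m))

def and_gate(cut1, cut2):
    """Interval product: fuzzy probability of an AND gate whose two
    independent basic events have the given alpha-cut intervals."""
    return (cut1[0] * cut2[0], cut1[1] * cut2[1])

# Two basic events with fuzzy probabilities around 0.1 and 0.2.
cut1 = tri_alpha_cut(0.05, 0.10, 0.15, 0.5)
cut2 = tri_alpha_cut(0.10, 0.20, 0.30, 0.5)
print(and_gate(cut1, cut2))
```

Sweeping alpha from 0 to 1 rebuilds the membership function of the top-event probability, replacing the single crisp value a conventional fault-tree analysis would demand.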

A Technique for the Quantitative Analysis of the Noise Jamming Effect (잡음재밍 효과에 대한 정량적 분석 기법)

  • Kim, Sung-Jin;Kang, Jong-Jin
    • Journal of the Korea Institute of Military Science and Technology
    • /
    • v.8 no.4 s.23
    • /
    • pp.91-101
    • /
    • 2005
  • In this paper, a technique for the quantitative analysis of the noise jamming effect is proposed. This technique, based upon mathematical modeling of noise jammers and probability theory for random processes, analyzes the jamming effect by modeling the relationship among the jammer, the radar variables, and the radar detection probability in a noise jamming environment. Computer simulation results show that the proposed technique not only makes quantitative analysis of the jamming effect possible, but also provides a basis for quantitative analysis of the electronic warfare environment.
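A simplified version of the jammer/radar relationship can be written down from the standard radar and jamming range equations (with constants and losses omitted); the parameter values are illustrative, not the paper's model:

```python
import math

def signal_to_jam_db(pt: float, gt: float, sigma: float,
                     pj: float, gj: float,
                     r_tgt: float, r_jam: float) -> float:
    """Signal-to-jamming ratio (dB): the radar's skin return falls off
    as 1/R^4 (two-way path) while the jammer's noise falls off as
    1/R^2 (one-way path).  Constants and losses are omitted, so the
    absolute level is only illustrative."""
    s = pt * gt**2 * sigma / r_tgt**4   # proportional skin-return power
    j = pj * gj / r_jam**2              # proportional jammer noise power
    return 10.0 * math.log10(s / j)

# For a self-screening jammer (r_tgt == r_jam), S/J improves as the
# target closes in; "burn-through" is where S/J crosses the detection
# threshold, which then fixes the detection probability.
for r in (50_000.0, 20_000.0, 5_000.0):
    print(int(r), round(signal_to_jam_db(1e6, 1e3, 5.0, 100.0, 10.0, r, r), 1))
```

Mapping the resulting S/J (or SNR) to a detection probability via the radar's detector statistics is the step that makes the jamming effect quantitative.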

On Performance Evaluation of Hybrid Decode-Amplify-Forward Relaying Protocol with Partial Relay Selection in Underlay Cognitive Networks

  • Duy, Tran Trung;Kong, Hyung Yun
    • Journal of Communications and Networks
    • /
    • v.16 no.5
    • /
    • pp.502-511
    • /
    • 2014
  • In this paper, we evaluate the performance of a hybrid decode-amplify-forward relaying protocol in underlay cognitive radio. In the proposed protocol, a secondary relay chosen by the partial relay selection method assists the transmission between a secondary source and a secondary destination. In particular, if the chosen relay decodes the secondary source's signal successfully, it forwards the decoded signal to the secondary destination. Otherwise, it amplifies the signal received from the secondary source and forwards the amplified signal to the secondary destination. We evaluate the performance of our scheme via theory and simulation. Results show that the proposed protocol outperforms the amplify-and-forward and decode-and-forward protocols in terms of outage probability.
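The outage-probability metric used in this comparison can be illustrated on a single Rayleigh-fading link (a sketch only, not the full relay-selection protocol or its interference constraint); the average SNR and target rate below are hypothetical:

```python
import math
import random

random.seed(9)

def outage_probability(avg_snr: float, rate: float,
                       trials: int = 100_000) -> float:
    """Monte Carlo outage probability of one Rayleigh-fading link:
    the instantaneous SNR is exponential with mean avg_snr, and an
    outage occurs when log2(1 + SNR) falls below the target rate."""
    threshold = 2.0 ** rate - 1.0
    outages = sum(1 for _ in range(trials)
                  if random.expovariate(1.0 / avg_snr) < threshold)
    return outages / trials

# Closed form for the same link: 1 - exp(-threshold / avg_snr).
sim = outage_probability(10.0, 1.0)
exact = 1.0 - math.exp(-(2.0 ** 1.0 - 1.0) / 10.0)
print(round(sim, 3), round(exact, 3))
```

The paper's analysis does the analogous computation over the two-hop source-relay-destination path, which is where the hybrid decode/amplify choice changes the outage behavior.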