• Title/Summary/Keyword: probabilistic process

Mechanical Integrity Evaluation on the Degraded Cladding Tube of Spent Nuclear Fuel Under Axial and Bending Loads During Transportation

  • Lee, Seong-Ki;Lee, Dong-Hyo;Park, Joon-Kyoo;Kim, Jae-Hoon
    • Journal of Nuclear Fuel Cycle and Waste Technology(JNFCWT)
    • /
    • v.19 no.4
    • /
    • pp.491-501
    • /
    • 2021
  • This paper evaluates the mechanical integrity of Spent Nuclear Fuel (SNF) cladding under lateral loads during transportation. The evaluation process requires conservative consideration of the degradation conditions of SNF cladding, especially the hydride effect, which reduces the ductility of the cladding. The dynamic forces occurring during a drop event are the pinch force, axial force, and bending moment. Of these, the axial force and bending moment can induce transverse tearing of the cladding. Our assessment of 14 × 14 PWR SNF was performed using finite element analysis that accounts for SNF characteristics. We also applied probabilistic procedures using a Monte Carlo method and a reliability evaluation. The evaluation results revealed no probability of damage under normal conditions, and a small probability of transverse failure under accident conditions.
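
The probabilistic part of such an assessment can be sketched as a Monte Carlo reliability loop: sample a (hydride-degraded) strain capacity and a drop-induced strain demand, and count the trials in which demand exceeds capacity. All distributions and numbers below are illustrative stand-ins, not the paper's data.

```python
import random

random.seed(42)

def failure_probability(n_trials=100_000):
    """Monte Carlo reliability sketch: a drop event fails the cladding
    when the bending-induced strain demand exceeds the hydride-degraded
    strain capacity.  Distributions and parameters are illustrative."""
    failures = 0
    for _ in range(n_trials):
        # Capacity: degraded cladding strain limit (illustrative lognormal)
        capacity = random.lognormvariate(-4.0, 0.3)
        # Demand: strain from axial force + bending moment during the drop
        demand = random.gauss(0.004, 0.002)
        if demand > capacity:
            failures += 1
    return failures / n_trials

p_fail = failure_probability()
```

With these made-up parameters the estimated failure probability is very small, mirroring the qualitative conclusion of the abstract.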

Derivation of uncertainty importance measure and its application

  • Park, Chang-K.
    • Proceedings of the Korean Operations and Management Science Society Conference
    • /
    • 1990.04a
    • /
    • pp.272-288
    • /
    • 1990
  • The uncertainty quantification process in Probabilistic Risk Assessment usually involves specifying the uncertainty in the input data and propagating this uncertainty to the final risk results. Distributional sensitivity analysis studies the impact of the various assumptions made during the quantification of input parameter uncertainties on the final output uncertainty. The uncertainty importance of input parameters, in this case, should reflect the degree of change in the whole output distribution and not just in a point estimate. A measure of uncertainty importance is proposed in the present paper. The measure, called the distributional sensitivity measure (DSM), is derived explicitly from the definition of Kullback's discrimination information. The DSM is applied to three typical cases of input distributional change: 1) uncertainty is completely eliminated, 2) the uncertainty range is increased by a factor of 10, and 3) the type of distribution is changed. In all three cases, the DSM-based importance ranking agrees very well with the observed changes in the output distribution, while other statistical parameters are shown to be insensitive.
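
A minimal sketch of a DSM-style computation: Kullback's discrimination information between histogram estimates of the output distribution before and after one input's uncertainty is eliminated (case 1 above). The toy output model X + Y and all parameters are assumptions for illustration, not the paper's formulation.

```python
import math
import random

random.seed(0)

def kl_divergence(p, q):
    """Kullback's discrimination information D(p||q) for discrete densities."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def histogram(samples, edges):
    counts = [0] * (len(edges) - 1)
    for s in samples:
        for i in range(len(edges) - 1):
            if edges[i] <= s < edges[i + 1]:
                counts[i] += 1
                break
    total = sum(counts)
    # small floor avoids log(0) when a bin is empty
    return [(c + 1e-9) / (total + 1e-9 * len(counts)) for c in counts]

# Base case: output = X + Y with both inputs uncertain (toy model)
base = [random.gauss(0, 1) + random.gauss(0, 1) for _ in range(20_000)]
# Perturbed case: uncertainty of the first input completely eliminated
pert = [0.0 + random.gauss(0, 1) for _ in range(20_000)]

edges = [-8 + 0.5 * i for i in range(33)]
dsm = kl_divergence(histogram(base, edges), histogram(pert, edges))
```

A larger DSM for one input than another would rank that input as more important to the shape of the whole output distribution, not just to a point estimate.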

Application of Markov Chains and Monte Carlo Simulations for Pavement Construction Engineering

  • Nega, Ainalem;Gedafa, Daba
    • International conference on construction engineering and project management
    • /
    • 2022.06a
    • /
    • pp.1043-1050
    • /
    • 2022
  • Markov chains and Monte Carlo simulation were applied to account for the probabilistic nature of pavement deterioration over time, using data collected in the field. The primary purpose of this study was to evaluate the pavement network performance of Western Australia (WA) by applying existing pavement management tools relevant to WA road construction networks. Two approaches were used to analyze the pavement networks: evaluating current pavement performance data to assess WA State Road networks, and predicting future states using past and current pavement data. The Markov chain process and Monte Carlo simulation methods were used to predict future conditions. The results indicated that the Markov chain and Monte Carlo simulation prediction models perform well compared with pavement performance data from the last four decades. The results also revealed the impact of design, traffic demand, climate, and construction standards on urban pavement performance. This study recommends an appropriate and effective pavement engineering management system for proper pavement design and analysis, preliminary planning, future pavement maintenance and rehabilitation, service life, and sustainable pavement construction functionality.
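
The core of such a model is a condition-state transition matrix propagated either analytically or by Monte Carlo sampling. A sketch with a made-up four-state matrix (not the WA field data):

```python
import random

random.seed(1)

# Illustrative 4-state pavement condition model (Good, Fair, Poor, Failed);
# transition probabilities are invented for demonstration.
P = [
    [0.80, 0.15, 0.04, 0.01],   # from Good
    [0.00, 0.75, 0.20, 0.05],   # from Fair
    [0.00, 0.00, 0.70, 0.30],   # from Poor
    [0.00, 0.00, 0.00, 1.00],   # Failed is absorbing
]

def propagate(state, P, years):
    """Analytical Markov-chain prediction: state vector after n transitions."""
    for _ in range(years):
        state = [sum(state[i] * P[i][j] for i in range(4)) for j in range(4)]
    return state

def simulate(P, years, runs=10_000):
    """Monte Carlo check of the same chain: every section starts in Good."""
    failed = 0
    for _ in range(runs):
        s = 0
        for _ in range(years):
            r, acc = random.random(), 0.0
            for j, p in enumerate(P[s]):
                acc += p
                if r < acc:
                    s = j
                    break
        failed += (s == 3)
    return failed / runs

dist10 = propagate([1.0, 0.0, 0.0, 0.0], P, 10)   # analytic 10-year forecast
mc10 = simulate(P, 10)                            # simulated failed fraction
```

The analytic state vector and the Monte Carlo estimate of the failed fraction should agree to within sampling error, which is the consistency check such studies rely on.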

Development of Application Method of Influent Wastewater Generation and Activated Sludge Process Design Based on Probability Density Function (확률밀도함수 기반 유입하수 재현 및 활성슬러지공정 설계기법 개발)

  • You, Kwangtae;Kim, Jongrack;Yun, Zuhwan;Pak, Gijung
    • Journal of Korean Society on Water Environment
    • /
    • v.33 no.2
    • /
    • pp.140-148
    • /
    • 2017
  • An important factor in determining the design and treatment efficiency of wastewater treatment plants (WWTPs) is the quantity and quality of the influent. This detailed and accurate information is essential for process control, diagnosis, and operation, as well as the basis for designing the plant, selecting the process, and determining the optimal capacity of each bioreactor. Probabilistic models are used to predict the wastewater quantity and quality of WWTPs and are widely used to improve WWTP design and operation. In this study, the optimal probability distribution of time-series influent data was derived for predicting water quantity and quality, and influent wastewater data were generated using Monte Carlo simulation. In addition, we evaluated various alternatives for improving bioreactor operations under present operating conditions using the generated influent data and an activated sludge model, and suggested the alternative that operates most effectively. The generated influent quantity and quality were highly correlated with the actual operating data, so the actual WWTP influent characteristics were well reproduced. Using this approach will improve the operating conditions of WWTPs, and an improvement plan for the current TMS (Tele Monitoring System) effluent quality standards can be proposed.
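
The generation step can be sketched as fitting a probability distribution to observed influent data and drawing a synthetic series from the fitted PDF via Monte Carlo. The lognormal choice and all numbers here are illustrative assumptions, not the study's measured data.

```python
import math
import random
import statistics

random.seed(7)

# Illustrative daily influent flows (m^3/d); real design work would use
# measured time-series data.  These values are synthetic stand-ins.
observed = [random.lognormvariate(10.0, 0.25) for _ in range(365)]

# Fit a lognormal distribution by the method of moments on log-flows
logs = [math.log(q) for q in observed]
mu_hat = statistics.fmean(logs)
sigma_hat = statistics.stdev(logs)

# Monte Carlo generation of a synthetic influent year from the fitted PDF
synthetic = [random.lognormvariate(mu_hat, sigma_hat) for _ in range(365)]

obs_mean = statistics.fmean(observed)
syn_mean = statistics.fmean(synthetic)
```

A synthetic series generated this way preserves the statistical character of the observed influent, which is what allows it to drive activated sludge model scenarios in place of the raw record.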

A Multi-stage Markov Process Model to Evaluate the Performance of Priority Queues in Discrete-Event Simulation: A Case Study with a War Game Model (이산사건 시뮬레이션에서의 우선순위 큐 성능분석을 위한 다단계 마코브 프로세스 모델: 창조 모델에 대한 사례연구)

  • Yim, Dong-Soon
    • Journal of the Korea Society for Simulation
    • /
    • v.17 no.4
    • /
    • pp.61-69
    • /
    • 2008
  • In order to evaluate the performance of priority queues for the future event list in discrete-event simulations, models representing the patterns of the enqueue and dequeue processes are required. The time complexities of diverse priority queue implementations can be compared using such performance models. This study aims at developing these performance models, especially for the environment in which a developed simulation model is used repeatedly over a long period. The developed performance model is based on multi-stage Markov process models; the probabilistic patterns of enqueue and dequeue are captured by incorporating non-homogeneous transition probabilities. All necessary parameters in this performance model can be estimated by analyzing the results obtained by executing the simulation model. A case study with a war game simulation model shows how the parameters defined in the multi-stage Markov process models are estimated.
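
A toy version of the idea: drive enqueue/dequeue operations on a binary-heap future event list with stage-dependent probabilities, where the stage itself switches as a simple Markov process (a "build-up" stage where enqueues dominate, then a balanced "steady" stage). All probabilities are invented for illustration.

```python
import heapq
import random

random.seed(3)

stage_enqueue_prob = [0.8, 0.5]   # P(next operation is an enqueue | stage)
stage_advance_prob = [0.01, 0.0]  # P(switch build-up -> steady) per operation

def run(ops=50_000):
    """Exercise a heap-based future event list under a two-stage
    Markov-modulated enqueue/dequeue pattern; return the mean queue length,
    a proxy for the workload a priority queue implementation must handle."""
    fel, t, stage = [], 0.0, 0
    sizes = []
    for _ in range(ops):
        if fel and random.random() > stage_enqueue_prob[stage]:
            t = heapq.heappop(fel)                            # dequeue next event
        else:
            heapq.heappush(fel, t + random.expovariate(1.0))  # schedule event
        sizes.append(len(fel))
        if random.random() < stage_advance_prob[stage]:
            stage = 1
    return sum(sizes) / len(sizes)

avg_len = run()
```

In a full performance model, statistics like this mean queue length (gathered per stage) would parameterize the Markov process used to compare priority queue implementations.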

Assessment of modal parameters considering measurement and modeling errors

  • Huang, Qindan;Gardoni, Paolo;Hurlebaus, Stefan
    • Smart Structures and Systems
    • /
    • v.15 no.3
    • /
    • pp.717-733
    • /
    • 2015
  • Modal parameters of a structure are commonly used quantities for system identification and damage detection. Given the limited number of studies on the statistical assessment of modal parameters, this paper presents procedures to properly account for the uncertainties present in the process of extracting modal parameters. In particular, this paper focuses on how to deal with the measurement error in an ambient vibration test and the modeling error resulting from the modal parameter extraction process. A bootstrap approach is adopted when an ensemble of a limited number of noisy time-history response recordings is available. To estimate the modeling error associated with the extraction process, a model prediction expansion approach is adopted, in which the modeling error is considered an "adjustment" to the prediction obtained from the extraction process. The proposed procedures can be further incorporated into the probabilistic analysis of applications in which the modal parameters are used. This study considers the effects of the measurement and modeling errors and can provide guidance in allocating resources to improve the estimation accuracy of the modal data. As an illustration, the proposed procedures are applied to extract the modal data of a damaged beam, and the extracted modal data are used to detect potential damage locations using a damage detection method. It is shown that the variability in the modal parameters due to the measurement and modeling errors is quite low; however, even this low variability has a significant impact on the damage detection results for the studied beam.
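
The bootstrap step can be sketched as resampling a small ensemble of per-recording frequency estimates. The crude spectral-scan "extractor" and the ~2 Hz mode below are stand-ins for a real modal extraction process, not the paper's method.

```python
import math
import random
import statistics

random.seed(5)

def make_recording(f, fs=100, secs=10, noise=0.3):
    """Synthetic ambient-vibration record: one mode plus measurement noise."""
    return [math.sin(2 * math.pi * f * i / fs) + random.gauss(0, noise)
            for i in range(fs * secs)]

def extract_frequency(x, fs=100):
    """Crude spectral scan on a 0.1 Hz grid; a stand-in for a real
    modal parameter extraction algorithm."""
    best_f, best_p = 0.0, -1.0
    for k in range(1, 51):
        f = 0.1 * k
        c = sum(xi * math.cos(2 * math.pi * f * i / fs) for i, xi in enumerate(x))
        s = sum(xi * math.sin(2 * math.pi * f * i / fs) for i, xi in enumerate(x))
        p = c * c + s * s
        if p > best_p:
            best_f, best_p = f, p
    return best_f

# Limited ensemble of noisy recordings of a ~2 Hz mode
ensemble = [extract_frequency(make_recording(2.0 + random.gauss(0, 0.1)))
            for _ in range(12)]

# Bootstrap: resample the small ensemble to characterize the estimator
boot = [statistics.fmean(random.choices(ensemble, k=len(ensemble)))
        for _ in range(2000)]
f_mean = statistics.fmean(boot)
f_std = statistics.stdev(boot)
```

The bootstrap mean and standard deviation quantify the variability of the extracted frequency due to measurement noise, which can then feed a probabilistic damage detection analysis.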

Analysis and Calculation of Factors Influencing the Sortie Generation Rate (SGR) of Aircraft-carrying Naval Ships (함재기탑재 함정의 소티 생성률(Sortie Generation Rate) 영향인자 분석 및 산출 연구)

  • Sunah Jung;Heechang Yoon;Seungheon Oh;Jonghoon Woo;Sangwoo Bae;Dongi Park;Woongsub Lee;Jaehyuk Lee;Hyuk Lee;Junghoon Chung
    • Journal of the Society of Naval Architects of Korea
    • /
    • v.61 no.4
    • /
    • pp.267-277
    • /
    • 2024
  • The Sortie Generation Rate (SGR) is a critical performance indicator for carrier-based aircraft and a key factor in the carrier design process. This study aims to analyze the factors that affect SGR and to establish a representative Sortie Generation Process (SGP), along with simulation results, to calculate the SGR of a naval ship equipped to carry aircraft. Detailed SGR factors are identified from the perspectives of the aircraft, aviation personnel, and aircraft carrier during the flight preparation stage, and the SGP is established accordingly. The Korean Navy's CVX basic design is chosen as a representative case for detailed analysis. The physical dimensions and spots of the deck design, together with time and probabilistic data for the SGP, are used to develop a queueing network model for SGR calculation. To capture the specific probabilistic features, the model was solved with discrete-event simulation tools (SimPy and AnyLogic), whose results show close agreement. These findings on SGR factors and calculation are expected to be incorporated into the future development of SGR calculation algorithms and to provide guidelines for the proper design of aircraft carriers based on a concrete operational concept.
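
Since SimPy and AnyLogic are external tools, here is a dependency-free discrete-event sketch of one sortie generation cycle (flight preparation, launch-spot queueing, mission) with entirely illustrative service times; the real model's deck layout, resources, and timing data are far more detailed.

```python
import heapq
import random

random.seed(11)

def simulate_sorties(n_aircraft=8, n_spots=2, hours=24.0):
    """Discrete-event sketch of a sortie generation process: each aircraft
    repeatedly undergoes flight preparation, queues for one of n_spots
    launch spots, launches, and flies a mission.  All durations (hours)
    are made up for illustration."""
    t_free = [0.0] * n_spots          # next free time of each launch spot
    sorties = 0
    events = [(0.0, i) for i in range(n_aircraft)]   # (ready time, aircraft)
    heapq.heapify(events)
    while events:
        t, i = heapq.heappop(events)
        if t >= hours:
            continue                                  # past the operating day
        prep = random.lognormvariate(0.0, 0.3)        # fueling/arming (~1 h median)
        spot = min(range(n_spots), key=lambda k: t_free[k])
        start = max(t + prep, t_free[spot])           # wait for a free spot
        launch = start + random.expovariate(12.0)     # ~5 min launch cycle
        t_free[spot] = launch
        if launch <= hours:
            sorties += 1
        mission = random.uniform(1.0, 2.0)            # sortie duration
        heapq.heappush(events, (launch + mission, i)) # aircraft returns
    return sorties

sorties_per_day = simulate_sorties()
```

Dividing the sortie count by the operating period gives an SGR estimate; sweeping parameters such as the number of launch spots shows how deck design choices move the rate.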

Non-Simultaneous Sampling Deactivation during the Parameter Approximation of a Topic Model

  • Jeong, Young-Seob;Jin, Sou-Young;Choi, Ho-Jin
    • KSII Transactions on Internet and Information Systems (TIIS)
    • /
    • v.7 no.1
    • /
    • pp.81-98
    • /
    • 2013
  • Since Probabilistic Latent Semantic Analysis (PLSA) and Latent Dirichlet Allocation (LDA) were introduced, many revised or extended topic models have appeared. Due to the intractable likelihood of these models, training any topic model requires an approximation algorithm such as variational approximation, Laplace approximation, or Markov chain Monte Carlo (MCMC). Although these approximation algorithms perform well, training a topic model is still computationally expensive given the large amount of data it requires. In this paper, we propose a new method, called non-simultaneous sampling deactivation, for efficient approximation of the parameters of a topic model. While each random variable is normally sampled or obtained within a single predefined burn-in period in traditional approximation algorithms, our new method is based on the observation that the random variable nodes in one topic model have different periods of convergence. During the iterative approximation process, the proposed method allows each random variable node to be terminated, or deactivated, once it has converged. Therefore, compared to traditional approximation schemes, in which every node is usually deactivated concurrently, the proposed method achieves inference efficiency in terms of time and memory. We do not propose a new approximation algorithm, but a new process applicable to existing approximation algorithms. Through experiments, we show the time and memory efficiency of the method and discuss the tradeoff between the efficiency of the approximation process and parameter consistency.
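
The deactivation idea can be illustrated outside topic models: run each random-variable "node" only until its own running estimate stabilizes, rather than for one shared burn-in sized to the slowest node. The convergence rule and the toy Gaussian nodes below are assumptions, not the paper's algorithm.

```python
import random

random.seed(2)

def samples_until_converged(mu, sigma, tol=0.01, window=500, cap=100_000):
    """Sample one 'node' until its running mean changes by less than tol
    between successive windows, then deactivate it.  Returns the number of
    samples spent and the final estimate."""
    s, prev = 0.0, None
    for k in range(1, cap + 1):
        s += random.gauss(mu, sigma)
        if k % window == 0:
            cur = s / k
            if prev is not None and abs(cur - prev) < tol:
                return k, cur            # converged: deactivate this node
            prev = cur
    return cap, s / cap

# Toy nodes with different convergence speeds: (true mean, noise level)
nodes = [(0.0, 0.1), (1.0, 1.0), (-2.0, 3.0)]
results = [samples_until_converged(mu, sig) for mu, sig in nodes]
costs = [k for k, _ in results]
estimates = [e for _, e in results]

non_simultaneous_cost = sum(costs)
# Simultaneous baseline: every node runs as long as the slowest one
simultaneous_cost = len(nodes) * max(costs)
```

Because node sampling costs are summed rather than padded to the slowest node's burn-in, the non-simultaneous schedule never costs more than the simultaneous one, which is the source of the time and memory savings the abstract reports.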

Decision-making of alternative pylon shapes of a benchmark cable-stayed bridge using seismic risk assessment

  • Akhoondzade-Noghabi, Vahid;Bargi, Khosrow
    • Earthquakes and Structures
    • /
    • v.11 no.4
    • /
    • pp.583-607
    • /
    • 2016
  • One of the main applications of seismic risk assessment is that a specific design can be selected for a bridge from different alternatives by considering damage losses alongside primary construction costs. Therefore, this paper focuses on selecting the shape of the pylon, which is a changeable component in the design of a cable-stayed bridge, as a two-criterion decision-making problem. The pylon shapes considered are the H, A, Y, and diamond shapes, and the two criteria are construction costs and probable earthquake losses. In this research, decision-making is performed using a developed seismic risk assessment process as a powerful method. Considering the uncertainties in the seismic risk assessment process, a fragility assessment method combining incremental dynamic analysis (IDA) and uniform design (UD) is proposed, in which the UD method provides the logical capacity models of the structure and the IDA method gives the probabilistic seismic demand model of the structure. Using these models and defined damage states, the fragility curves of the bridge system are obtained for the different pylon shapes. Finally, by combining the fragility curves with damage losses and implementing the proposed cost-loss-benefit (CLB) method, the seismic risk assessment process is developed with a financial-comparative approach. Thus, the optimal shape of the pylon can be determined using two-criterion decision-making. The final results indicate that the optimal pylon shapes for the studied span of the cable-stayed bridge are, in order, the H shape, diamond shape, Y shape, and A shape.
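
A fragility curve of the kind combined with losses here is commonly a lognormal CDF fitted to per-record IDA capacities. A sketch with made-up capacities (not the paper's bridge data):

```python
import math
import random

random.seed(9)

def lognormal_cdf(x, median, beta):
    """Common fragility-curve form paired with IDA results:
    P(damage state reached | intensity measure = x)."""
    return 0.5 * (1 + math.erf(math.log(x / median) / (beta * math.sqrt(2))))

# Illustrative IDA outcomes: for each ground-motion record, the intensity
# measure at which the pylon reaches the damage state (made-up values).
capacities = [random.lognormvariate(math.log(0.6), 0.4) for _ in range(40)]

# Fit fragility parameters by moments of log-capacity
logs = [math.log(c) for c in capacities]
median_hat = math.exp(sum(logs) / len(logs))
beta_hat = (sum((l - math.log(median_hat)) ** 2 for l in logs)
            / (len(logs) - 1)) ** 0.5

# Probability of reaching the damage state at an illustrative design IM
p_at_design = lognormal_cdf(0.6, median_hat, beta_hat)
```

Integrating such curves against damage losses for each pylon shape yields the expected-loss side of the two-criterion (cost vs. loss) comparison.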

An Introduction to Kinetic Monte Carlo Methods for Nano-scale Diffusion Process Modeling (나노 스케일 확산 공정 모사를 위한 동력학적 몬테칼로 소개)

  • Hwang, Chi-Ok;Seo, Ji-Hyun;Kwon, Oh-Seob;Kim, Ki-Dong;Won, Tae-Young
    • Journal of the Institute of Electronics Engineers of Korea SD
    • /
    • v.41 no.6
    • /
    • pp.25-31
    • /
    • 2004
  • In this paper, we introduce kinetic Monte Carlo (kMC) methods for simulating the diffusion process in nano-scale device fabrication. First, we review kMC theory and background and give a simple model of the point defect diffusion process during thermal annealing after ion (electron) implantation into a crystalline Si substrate, to help the reader understand kinetic Monte Carlo methods. kMC is a kind of Monte Carlo method, but it can simulate the time evolution of a diffusion process through a Poisson probabilistic process. In kMC, instead of solving differential reaction-diffusion equations via conventional finite difference or finite element methods, the diffusion process is modeled as a series of chemical reaction events (between atoms and/or defects) or diffusion events occurring according to the event rates of all possible events. Every event has its own event rate, and the time evolution of the semiconductor diffusion process is simulated directly. The event rates can be derived directly from molecular dynamics (MD) or first-principles (ab initio) calculations, or from experimental data.
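
A minimal kMC loop for a single hopping point defect shows the scheme: Arrhenius event rates and exponentially distributed residence times realize the Poisson process mentioned above. The attempt frequency, migration energy, and temperature (~1000 K) are illustrative values, not the paper's parameters.

```python
import math
import random

random.seed(4)

def kmc_defect_walk(steps=2000, nu=1e13, Em=0.6, kT=0.0861):
    """Minimal kMC sketch of point-defect diffusion: one defect on a 1-D
    lattice hops left or right with an Arrhenius rate; time advances by an
    exponentially distributed residence time drawn from the total event
    rate (Poisson process).  nu in 1/s, Em and kT in eV (kT ~ 1000 K)."""
    rate = nu * math.exp(-Em / kT)       # hop rate per direction (1/s)
    x, t = 0, 0.0
    for _ in range(steps):
        total = 2.0 * rate               # two possible events: hop left/right
        t += random.expovariate(total)   # residence time ~ Exp(total rate)
        x += 1 if random.random() < 0.5 else -1   # pick an event uniformly
    return x, t

x_final, t_final = kmc_defect_walk()
```

Note that physical time is an output of the simulation rather than a fixed step size: each iteration advances the clock by a stochastic residence time set by the current total event rate, which is what lets kMC reach annealing timescales that molecular dynamics cannot.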