• Title/Summary/Keyword: Prior Distributions

A Methodology for Estimating the Uncertainty in Model Parameters Applying the Robust Bayesian Inferences

  • Kim, Joo Yeon;Lee, Seung Hyun;Park, Tai Jin
    • Journal of Radiation Protection and Research / v.41 no.2 / pp.149-154 / 2016
  • Background: Any real application of Bayesian inference must acknowledge that both the prior distribution and the likelihood function have been specified only as more or less convenient approximations to whatever the analyzer's true beliefs might be. If the inferences from a Bayesian analysis are to be trusted, it is important to determine that they are robust to such variations of prior and likelihood as might also be consistent with the analyzer's stated beliefs. Materials and Methods: Robust Bayesian inference was applied to an atmospheric dispersion assessment using a Gaussian plume model. The scope of contamination was specified in terms of uncertainty in the distribution type and in parametric variability. The probability distributions of the model parameters were assumed to be contaminated with symmetric unimodal and unimodal distributions, and the distribution of the sector-averaged relative concentrations was then calculated by applying the contaminated priors to the model parameters. Results and Discussion: The sector-averaged concentrations for each stability class were compared under the symmetric unimodal and unimodal priors, respectively, used as the contaminated priors within the class of ${\varepsilon}$-contamination. With ${\varepsilon}$ set to 10%, the medians obtained with the symmetric unimodal priors agreed within about 10% of those obtained with the plausible priors, whereas the medians obtained with the unimodal priors deviated by up to 20% at a few downwind distances. Conclusion: Robustness was assessed by estimating how far the results of the Bayesian inferences vary under reasonable variations of the plausible priors. On the basis of these robust inferences, it is reasonable to apply symmetric unimodal priors when analyzing the robustness of Bayesian inferences.
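
The class of ${\varepsilon}$-contamination referred to above mixes a plausible prior with a contaminating distribution, $\pi(\theta) = (1-\varepsilon)\pi_0(\theta) + \varepsilon q(\theta)$. The following minimal Python sketch illustrates that idea with an assumed lognormal plausible prior, an assumed wider contaminant, and a stand-in response function; it is not the paper's Gaussian plume model.

    # Sketch: epsilon-contamination of a prior, assuming a lognormal "plausible"
    # prior pi0 and a wider contaminating distribution q (both hypothetical;
    # the paper's actual Gaussian plume parameters are not reproduced here).
    import numpy as np

    rng = np.random.default_rng(0)
    eps = 0.10                      # contamination fraction used in the paper
    n = 100_000

    def sample_plausible(size):     # pi0: plausible prior (assumed lognormal)
        return rng.lognormal(mean=0.0, sigma=0.3, size=size)

    def sample_contaminant(size):   # q: contaminating distribution (assumed, wider)
        return rng.lognormal(mean=0.0, sigma=0.6, size=size)

    # Draw from the mixture (1 - eps) * pi0 + eps * q
    use_q = rng.random(n) < eps
    theta = np.where(use_q, sample_contaminant(n), sample_plausible(n))

    # Stand-in for the sector-averaged relative concentration model output
    def model_output(theta):
        return 1.0 / theta          # hypothetical monotone response

    med_plausible = np.median(model_output(sample_plausible(n)))
    med_contaminated = np.median(model_output(theta))
    print("relative shift in median: %.1f%%"
          % (100 * abs(med_contaminated - med_plausible) / med_plausible))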

A Comparison Study of Bayesian Methods for a Threshold Autoregressive Model with Regime-Switching (국면전환 임계 자기회귀 분석을 위한 베이지안 방법 비교연구)

  • Roh, Taeyoung;Jo, Seongil;Lee, Ryounghwa
    • The Korean Journal of Applied Statistics / v.27 no.6 / pp.1049-1068 / 2014
  • Autoregressive models are used to analyze univariate time series data; however, they can be inappropriate when a structural break appears in the series, since they assume that the trend is consistent. Threshold autoregressive models, a popular class of regime-switching models, have been proposed to address this problem, and they have recently been extended to two-regime switching models with a delay parameter. We discuss two-regime threshold autoregressive models from a Bayesian point of view. For the Bayesian analysis, we consider a parametric threshold autoregressive model and a nonparametric threshold autoregressive model using a Dirichlet process prior. The posterior distributions are derived, and posterior inference is performed via Markov chain Monte Carlo methods for the two Bayesian threshold autoregressive models. We present a simulation study to compare the performance of the models and apply them to gross domestic product data of the U.S.A. and South Korea.
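
For readers unfamiliar with the model class, the sketch below simulates the data-generating mechanism that a two-regime threshold autoregressive model assumes; the coefficients, threshold, and delay are illustrative values, and the Bayesian estimation step (parametric or Dirichlet-process-based) discussed in the paper is only noted in a comment.

    # Sketch: simulating a two-regime threshold autoregressive (TAR) series,
    # y_t = phi1 * y_{t-1} + e_t if y_{t-d} <= r, else phi2 * y_{t-1} + e_t.
    # Parameter values and the delay d are illustrative, not from the paper.
    import numpy as np

    rng = np.random.default_rng(1)
    phi1, phi2 = 0.6, -0.4       # AR coefficients in regime 1 and regime 2 (assumed)
    r, d, sigma = 0.0, 1, 1.0    # threshold, delay, noise scale (assumed)

    T = 500
    y = np.zeros(T)
    for t in range(1, T):
        phi = phi1 if y[t - d] <= r else phi2
        y[t] = phi * y[t - 1] + rng.normal(scale=sigma)

    # A Bayesian treatment would place priors on (phi1, phi2, r, d, sigma)
    # and sample their posterior by MCMC; this sketch only shows the
    # data-generating mechanism such a model assumes.
    print("share of observations in regime 1:", np.mean(y[:-1] <= r))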

Bayesian Nonstationary Probability Rainfall Estimation using the Grid Method (Grid Method 기법을 이용한 베이지안 비정상성 확률강수량 산정)

  • Kwak, Dohyun;Kim, Gwangseob
    • Journal of Korea Water Resources Association / v.48 no.1 / pp.37-44 / 2015
  • A Bayesian nonstationary probability rainfall estimation model using the grid method is developed. A hierarchical Bayesian framework is constructed with prior and hyper-prior distributions associated with the parameters of the Gumbel distribution, which is selected for the rainfall extreme data. In this study, the grid method is adopted instead of the Metropolis-Hastings algorithm for random number generation, since it has the advantage of providing a thorough sampling of the parameter space. This method is well suited to situations where the best-fit parameter values are not easily inferred a priori and where there is a high probability of false minima. The developed model was applied to estimate target-year probability rainfall using hourly rainfall data of the Seoul station from 1973 to 2012. Results demonstrated that the target-year estimate under the nonstationary assumption is about 5~8% larger than the estimate under the stationary assumption.
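
A minimal sketch of the grid method for a Gumbel posterior is given below, assuming synthetic annual-maximum data and a flat prior; the paper's hierarchical prior structure and the Seoul record are not reproduced.

    # Sketch: the grid method for a Gumbel posterior, evaluated on a regular
    # grid of (location mu, scale beta) nodes with a flat prior (assumed).
    import numpy as np

    rng = np.random.default_rng(2)
    data = rng.gumbel(loc=60.0, scale=15.0, size=40)   # hypothetical annual maxima (mm)

    mu_grid = np.linspace(40.0, 80.0, 200)
    beta_grid = np.linspace(5.0, 30.0, 200)
    MU, BETA = np.meshgrid(mu_grid, beta_grid, indexing="ij")

    # Gumbel log-likelihood evaluated at every grid node
    Z = (data[None, None, :] - MU[..., None]) / BETA[..., None]
    loglik = np.sum(-Z - np.exp(-Z), axis=-1) - data.size * np.log(BETA)

    post = np.exp(loglik - loglik.max())   # unnormalized posterior (flat prior)
    post /= post.sum()

    # Posterior mean of the 100-year rainfall, x_T = mu - beta * ln(-ln(1 - 1/T))
    T = 100
    xT = MU - BETA * np.log(-np.log(1.0 - 1.0 / T))
    print("posterior mean 100-year rainfall: %.1f mm" % np.sum(post * xT))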

A Bayesian Approach to Geophysical Inverse Problems (베이지안 방식에 의한 지구물리 역산 문제의 접근)

  • Oh Seokhoon;Chung Seung-Hwan;Kwon Byung-Doo;Lee Heuisoon;Jung Ho Jun;Lee Duk Kee
    • Geophysics and Geophysical Exploration / v.5 no.4 / pp.262-271 / 2002
  • This study presents a practical procedure for the Bayesian inversion of geophysical data. We applied geostatistical techniques for the acquisition of prior model information, and then the Markov chain Monte Carlo (MCMC) method was adopted to infer the characteristics of the marginal distributions of the model parameters. For the Bayesian inversion of dipole-dipole array resistivity data, we used indicator kriging and simulation techniques to generate cumulative density functions from Schlumberger array resistivity data and well-logging data, and obtained prior information by cokriging and simulations from covariogram models. The indicator approach makes it possible to incorporate non-parametric information into the probabilistic density function. We also adopted the MCMC approach, based on Gibbs sampling, to examine the characteristics of the a posteriori probability density function and the marginal distribution of each parameter.
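
The role of Gibbs sampling here is to draw from full conditional distributions so that the marginal posteriors of the model parameters can be examined. The toy sketch below does this for a bivariate Gaussian posterior standing in for the resistivity inverse problem; it is illustrative only.

    # Sketch: Gibbs sampling to recover marginal distributions of two model
    # parameters, using a toy bivariate Gaussian posterior in place of the
    # paper's resistivity inverse problem (purely illustrative).
    import numpy as np

    rng = np.random.default_rng(3)
    rho = 0.8                        # assumed posterior correlation between m1, m2
    n_iter, burn = 5000, 500
    m = np.zeros(2)
    samples = np.empty((n_iter, 2))

    for i in range(n_iter):
        # Full conditionals of a standard bivariate normal with correlation rho
        m[0] = rng.normal(loc=rho * m[1], scale=np.sqrt(1 - rho**2))
        m[1] = rng.normal(loc=rho * m[0], scale=np.sqrt(1 - rho**2))
        samples[i] = m

    post = samples[burn:]
    print("marginal means:", post.mean(axis=0))
    print("marginal std devs:", post.std(axis=0))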

Comparison of probability distributions to analyze the number of occurrence of torrential rainfall events (집중호우사상의 발생횟수 분석을 위한 확률분포의 비교)

  • Kim, Sang Ug;Kim, Hyeung Bae
    • Journal of Korea Water Resources Association / v.49 no.6 / pp.481-493 / 2016
  • A statistical analysis of torrential rainfall data, defined as rainfall exceeding 80 mm/day, is performed with Daegu and Busan rainfall data collected over 384 months. The number of occurrences of torrential rainfall events is usually modeled using a Poisson distribution. However, the Poisson distribution frequently fails to reproduce the statistical characteristics of the observed values when the observed data are zero-inflated. Therefore, in this study, the generalized Poisson distribution (GPD), the zero-inflated Poisson distribution (ZIP), the zero-inflated generalized Poisson distribution (ZIGP), and a Bayesian ZIGP model were used to resolve the zero-inflation problem in the torrential rainfall data. In particular, an informative prior distribution was used in the Bayesian ZIGP model to increase its accuracy. Finally, it was suggested that the Poisson and GPD models should be discouraged for fitting the frequency of torrential rainfall data, while the Bayesian ZIGP model using an informative prior provided the most accurate results. Additionally, the ZIP model was recommended as a practical alternative, since the Bayesian approach of this study is considerably complex.
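
As a concrete illustration of the zero-inflation issue, the sketch below fits a zero-inflated Poisson model to synthetic monthly event counts by maximum likelihood; the Daegu/Busan records, the ZIGP extension, and the Bayesian treatment with an informative prior are not reproduced.

    # Sketch: maximum-likelihood fit of a zero-inflated Poisson (ZIP) model to
    # synthetic monthly counts of torrential-rainfall events.
    import numpy as np
    from scipy.optimize import minimize
    from scipy.special import gammaln

    rng = np.random.default_rng(4)
    p_true, lam_true = 0.4, 1.5          # assumed zero-inflation prob. and rate
    n = 384                              # number of months, as in the paper
    counts = np.where(rng.random(n) < p_true, 0, rng.poisson(lam_true, size=n))

    def zip_neg_loglik(params, y):
        p, lam = params
        # P(Y=0) = p + (1-p)e^{-lam};  P(Y=k) = (1-p) e^{-lam} lam^k / k!, k >= 1
        log_pois = -lam + y * np.log(lam) - gammaln(y + 1)
        ll_zero = np.log(p + (1 - p) * np.exp(-lam))
        ll_pos = np.log(1 - p) + log_pois
        return -np.sum(np.where(y == 0, ll_zero, ll_pos))

    fit = minimize(zip_neg_loglik, x0=[0.3, 1.0], args=(counts,),
                   bounds=[(1e-6, 1 - 1e-6), (1e-6, None)], method="L-BFGS-B")
    print("estimated (p, lambda):", fit.x)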

Joint Segmentation of Multi-View Images by Region Correspondence (영역 대응을 이용한 다시점 영상 집합의 통합 영역화)

  • Lee, Soo-Chahn;Kwon, Dong-Jin;Yun, Il-Dong;Lee, Sang-Uk
    • Journal of Broadcast Engineering / v.13 no.5 / pp.685-695 / 2008
  • This paper presents a method to segment the object of interest from a set of multi-view images with minimal user interaction. Specifically, after the user segments an initial image, we first estimate the transformations between the foreground and background of the segmented image and those of a neighboring image, respectively. From these transformations, we obtain regions in the neighboring image that correspond to the foreground and background of the segmented image. We are then able to segment the neighboring image based on these regions, and we iterate this process to segment the whole image set. The transformation of the foreground is estimated by feature-based registration with free-form deformation, while the transformation of the background is estimated by a homography constrained to an affine transformation; both are based on corresponding point pairs. Segmentation is performed by estimating pixel color distributions, defining a shape prior based on the obtained foreground and background regions, and applying both to a Markov random field (MRF) energy minimization framework for image segmentation. Experimental results demonstrate the effectiveness of the proposed method.
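
The sketch below writes out one plausible form of such an MRF energy (a color-likelihood data term, a shape-prior term, and a Potts smoothness term) on placeholder arrays; the minimization itself, typically done with graph cuts, and the paper's actual terms and weights are not reproduced.

    # Sketch: evaluating a binary-segmentation MRF energy of the general form
    # E(L) = data term + shape-prior term + Potts smoothness term.
    # All inputs are random placeholders, not the paper's pipeline.
    import numpy as np

    rng = np.random.default_rng(5)
    H, W = 32, 32
    labels = rng.integers(0, 2, size=(H, W))        # candidate labeling (0=bg, 1=fg)
    logp_fg = np.log(rng.random((H, W)) + 1e-9)     # log P(color | foreground)
    logp_bg = np.log(rng.random((H, W)) + 1e-9)     # log P(color | background)
    shape_prior = rng.random((H, W))                # fg probability from region transfer
    lam, mu = 1.0, 1.0                              # term weights (assumed)

    def mrf_energy(labels):
        data = -np.where(labels == 1, logp_fg, logp_bg).sum()
        shape = -np.where(labels == 1, np.log(shape_prior + 1e-9),
                          np.log(1 - shape_prior + 1e-9)).sum()
        # Potts pairwise term over 4-connected neighbors
        smooth = ((labels[:, 1:] != labels[:, :-1]).sum()
                  + (labels[1:, :] != labels[:-1, :]).sum())
        return data + mu * shape + lam * smooth

    print("energy of the candidate labeling:", mrf_energy(labels))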

Hardness Distribution and Microstructures of Electric Resistance Spot Welded 1GPa Grade Dual Phase Steel (1GPa급 DP강 전기저항점용접부의 경도분포와 미세조직의 상관관계)

  • Na, Hye-Sung;Kong, Jong-Pan;Han, Tae-Kyo;Chin, Kwang-Geun;Kang, Chung-Yun
    • Journal of Welding and Joining / v.30 no.2 / pp.76-80 / 2012
  • In this study, the effect of the welding current on the hardness characteristics and microstructure in resistance spot welds of 1 GPa grade cold-rolled DP steel was investigated, and the correlation between hardness and microstructure was discussed. Despite changes in the welding current, the hardness distributions near the weld were similar: the hardness in the HAZ and the fusion zone was higher than that of the base metal, and the hardness within the fusion zone varied with location. In particular, the hardness of the HAZ adjacent to the base metal showed the maximum value, and a softening zone was found in the base metal adjacent to the HAZ. With increasing welding current, there was no difference in the maximum or average hardness of the fusion zone, but the hardness of the softening zone decreased. The difference in hardness at each location of the weld is attributed to the prior austenite grain size, and the softening of the base metal occurred by tempering of the martensite.

COMPARISON OF SPECKLE REDUCTION METHODS FOR MULTISOURCE LAND-COVER CLASSIFICATION BY NEURAL NETWORK : A CASE STUDY IN THE SOUTH COAST OF KOREA

  • Ryu, Joo-Hyung;Won, Joong-Sun;Kim, Sang-Wan
    • Proceedings of the KSRS Conference / 1999.11a / pp.144-147 / 1999
  • The objective of this study is to quantitatively evaluate the effects of various SAR speckle reduction methods on multisource land-cover classification by a backpropagation neural network, especially over the coastal region. Land-cover classification using a neural network has an advantage over conventional statistical approaches in that it is distribution-free and no prior knowledge of the statistical distributions of the classes is needed. The goal of multisource land-cover classification with data acquired by different sensors is to reduce the classification error, and consequently SAR can be utilized as a complementary tool to optical sensors. SAR speckle is, however, a serious limiting factor when the data are exploited for land-cover classification. To reduce this problem, we test various speckle reduction methods including Frost, Median, Kuan, and EPOS. By interpreting the network weights for the training pixel samples, the "Importance Value" of each speckle-reduced SAR image can be estimated based on its contribution to the classification. In this study, the "Importance Value" is used as a criterion of effectiveness.
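
The abstract does not define the "Importance Value" precisely; the sketch below shows one plausible weight-based reading (a Garson-style aggregation of absolute weight magnitudes per input band) on randomly initialized placeholder weights rather than a trained classifier.

    # Sketch: a weight-based importance heuristic for the input bands of a
    # backpropagation network. The weights below are random placeholders; the
    # paper's exact "Importance Value" definition may differ.
    import numpy as np

    rng = np.random.default_rng(6)
    n_inputs, n_hidden, n_classes = 5, 8, 4          # e.g. optical bands + one SAR band
    W_ih = rng.normal(size=(n_inputs, n_hidden))     # input-to-hidden weights
    W_ho = rng.normal(size=(n_hidden, n_classes))    # hidden-to-output weights

    # Contribution of input i via hidden unit j: |W_ih[i, j]| * sum_k |W_ho[j, k]|
    contrib = np.abs(W_ih) * np.abs(W_ho).sum(axis=1)
    importance = contrib.sum(axis=1)
    importance /= importance.sum()
    for i, v in enumerate(importance):
        print("input band %d: importance %.3f" % (i, v))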

Thermal Characteristic Simulation and Property Evaluation of High Melting Point Materials by Pulsed Current Activated Sintering Process (PCAS공정에 의한 고융점 소결체 열전달 해석 및 특성분석)

  • Nam, Hyo-Eun;Jang, Jun-Ho;Park, Hyun-Kuk;Oh, Ik-Hyun
    • Journal of Sensor Science and Technology / v.26 no.3 / pp.214-222 / 2017
  • In this study, the internal temperature distributions associated with sintering were simulated by the finite element method (FEM). The sintering mechanism of the pulsed current activated sintering (PCAS) process is still unclear because of some unexplained heat transfer phenomena in coupled multi-physical fields, as well as the difficulty in measuring the interior temperature of the metal powder. We carried out a simulation study to determine the thermal distribution between the graphite mold and the ruthenium powder prior to the PCAS process. For the PCAS process, the heating rate was maintained at $100^{\circ}C/min$; the simulation indicates that the sintering temperature ranged from $1000^{\circ}C$ to $1300^{\circ}C$ under 60 MPa. The heat transfer inside the ruthenium sintered-body sample was modeled through the whole process in order to predict the minimum interior temperature. The thermal simulation shows that the interior temperature gradient decreased with graphite punch length, and the calculated results agreed well with the PCAS field test results.
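
As a much-simplified stand-in for the coupled electro-thermal FEM analysis, the sketch below solves 1-D transient heat conduction with an explicit finite-difference scheme; the geometry, boundary conditions, and material values are assumed, not taken from the paper.

    # Sketch: 1-D explicit finite-difference model of transient heat conduction,
    # illustrating how an interior minimum temperature can be predicted.
    # Material and geometry values are assumed, not the paper's.
    import numpy as np

    L = 0.02                 # domain length, m (assumed)
    nx, dt, t_end = 50, 0.005, 60.0
    alpha = 1e-5             # thermal diffusivity, m^2/s (assumed)
    dx = L / (nx - 1)
    assert alpha * dt / dx**2 < 0.5, "explicit scheme stability limit"

    T = np.full(nx, 25.0)    # initial temperature, deg C
    T_boundary = 1000.0      # heated boundary (e.g. current-carrying punch side)

    for _ in range(int(t_end / dt)):
        T[0] = T_boundary                      # Dirichlet condition at heated end
        T[1:-1] += alpha * dt / dx**2 * (T[2:] - 2 * T[1:-1] + T[:-2])
        T[-1] = T[-2]                          # insulated (zero-flux) far end

    print("interior minimum temperature after %.0f s: %.1f deg C" % (t_end, T.min()))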

On using computational versus data-driven methods for uncertainty propagation of isotopic uncertainties

  • Radaideh, Majdi I.;Price, Dean;Kozlowski, Tomasz
    • Nuclear Engineering and Technology / v.52 no.6 / pp.1148-1155 / 2020
  • This work presents two different methods for quantifying and propagating the uncertainty associated with fuel composition at end of life for cask criticality calculations. The first approach, the computational approach, uses parametric uncertainties, including those associated with nuclear data, fuel geometry, material composition, and plant operation, to perform forward depletion on Monte-Carlo-sampled inputs. These uncertainties are based on experiments and prior experience in criticality safety. The second approach, the data-driven approach, relies on radiochemical assay data to derive code bias information, which is then used to perturb the isotopic inventory. For both approaches, the uncertainty in keff for the cask is propagated by performing forward criticality calculations on inputs sampled from the distributions obtained from each approach. It is found that the data-driven approach yields an uncertainty about 500 pcm higher than the computational approach. An exploration is also done to see whether considering correlations between isotopes at end of life affects the keff uncertainty, and the results demonstrate an effect of about 100 pcm.
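
The common machinery in both approaches is forward Monte-Carlo propagation: sample the isotopic inventory from its assumed distribution, run a criticality calculation on each sample, and read off the spread of keff. The sketch below does this with an invented linear surrogate in place of a real criticality code; the sensitivities and uncertainty magnitudes are illustrative only.

    # Sketch: forward Monte-Carlo propagation of isotopic uncertainty to a
    # response such as keff, using a linear surrogate instead of an actual
    # criticality code (all numbers invented for illustration).
    import numpy as np

    rng = np.random.default_rng(7)
    n_samples, n_isotopes = 1000, 10

    nominal = np.ones(n_isotopes)                     # nominal isotopic inventory
    sens = rng.uniform(-0.002, 0.002, n_isotopes)     # d(keff)/d(rel. perturbation), assumed

    def keff_surrogate(inventory):
        # Stand-in for a forward criticality calculation on a sampled inventory
        return 1.0 + sens @ (inventory / nominal - 1.0)

    def propagate(rel_sigma):
        inv = nominal * (1.0 + rng.normal(scale=rel_sigma, size=(n_samples, n_isotopes)))
        k = np.array([keff_surrogate(x) for x in inv])
        return k.std() * 1e5                          # spread expressed in pcm

    print("computational-approach spread: %.0f pcm" % propagate(rel_sigma=0.02))
    print("data-driven-approach spread:   %.0f pcm" % propagate(rel_sigma=0.03))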