• Title/Summary/Keyword: Bayesian analysis


Assessment of uncertainty associated with parameter of gumbel probability density function in rainfall frequency analysis (강우빈도해석에서 Bayesian 기법을 이용한 Gumbel 확률분포 매개변수의 불확실성 평가)

  • Moon, Jang-Won; Moon, Young-Il; Kwon, Hyun-Han
    • Journal of Korea Water Resources Association, v.49 no.5, pp.411-422, 2016
  • Rainfall-runoff modeling in conjunction with rainfall frequency analysis has been widely used for estimating design floods in South Korea. However, uncertainties associated with the underlying distribution and sampling error have not been properly addressed. This study applied a Bayesian method to quantify the uncertainties in rainfall frequency analysis with the Gumbel distribution. For comparison, the probability weighted moment (PWM) method was employed to estimate confidence intervals. The uncertainties associated with design rainfalls were quantitatively assessed using both the Bayesian and PWM methods. The results showed that the uncertainty ranges with PWM are larger than those with the Bayesian approach. In addition, the Bayesian approach was able to effectively represent the asymmetric features of the underlying distribution, whereas PWM produced symmetric confidence intervals due to the normal approximation. Longer records reduced the uncertainty in both methods, and the Bayesian approach performed better in terms of uncertainty reduction.
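The kind of Bayesian quantification described above can be sketched with a short Metropolis sampler. The synthetic annual maxima, flat priors, proposal steps, and the 100-year quantile below are illustrative assumptions rather than the paper's setup; they are only meant to show how posterior draws of the Gumbel parameters translate into a credible interval for a design rainfall.

```python
# A minimal sketch (not the paper's implementation): random-walk Metropolis
# sampling of Gumbel parameters for annual-maximum rainfall, giving a credible
# interval for the 100-year design rainfall.
import numpy as np

rng = np.random.default_rng(42)
data = rng.gumbel(loc=150.0, scale=40.0, size=40)   # synthetic annual maxima (mm)

def log_likelihood(mu, beta, x):
    if beta <= 0:
        return -np.inf
    z = (x - mu) / beta
    return np.sum(-np.log(beta) - z - np.exp(-z))

def log_posterior(mu, beta, x):
    # Flat (improper) priors on mu and beta: an illustrative choice.
    return log_likelihood(mu, beta, x)

n_iter, step = 20000, np.array([5.0, 3.0])
samples = np.empty((n_iter, 2))
theta = np.array([data.mean(), data.std()])          # crude starting point
logp = log_posterior(theta[0], theta[1], data)
for i in range(n_iter):
    prop = theta + step * rng.standard_normal(2)     # random-walk proposal
    logp_prop = log_posterior(prop[0], prop[1], data)
    if np.log(rng.random()) < logp_prop - logp:      # Metropolis accept/reject
        theta, logp = prop, logp_prop
    samples[i] = theta

burned = samples[5000:]                              # discard burn-in
T = 100
q100 = burned[:, 0] - burned[:, 1] * np.log(-np.log(1 - 1 / T))  # Gumbel quantile
lo, hi = np.percentile(q100, [2.5, 97.5])
print(f"100-year rainfall 95% credible interval: {lo:.1f}-{hi:.1f} mm")
```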

Parameter Optimization and Uncertainty Analysis of the NWS-PC Rainfall-Runoff Model Coupled with Bayesian Markov Chain Monte Carlo Inference Scheme (Bayesian Markov Chain Monte Carlo 기법을 통한 NWS-PC 강우-유출 모형 매개변수의 최적화 및 불확실성 분석)

  • Kwon, Hyun-Han; Moon, Young-Il; Kim, Byung-Sik; Yoon, Seok-Young
    • KSCE Journal of Civil and Environmental Engineering Research, v.28 no.4B, pp.383-392, 2008
  • It is not always easy to estimate the parameters of hydrologic models due to insufficient hydrologic data when hydraulic structures are designed or water resources plans are established. Uncertainty analysis is therefore needed to examine the reliability of the estimated results. With this in mind, this study applies a Bayesian Markov Chain Monte Carlo scheme to the widely used NWS-PC rainfall-runoff model, with a case study in the Soyang Dam watershed in Korea. The NWS-PC model is calibrated against observed daily runoff; thirteen model parameters are optimized and the posterior distribution of each parameter is derived. The Bayesian Markov Chain Monte Carlo scheme shows improved results in terms of statistical performance measures and graphical examination. Runoff patterns can be influenced by various factors, and the Bayesian approach is capable of translating these uncertainties into parameter uncertainties. One could guard against unexpected runoff events by utilizing the information provided by Bayesian methods. Therefore, rainfall-runoff analysis coupled with uncertainty analysis can give insight into evaluating flood risk and dam size in a reasonable way.
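As a rough illustration of MCMC calibration of a rainfall-runoff model against observed daily runoff, the sketch below replaces NWS-PC (and its thirteen parameters) with a hypothetical two-parameter linear-reservoir model; the priors, error model, and synthetic data are assumptions for demonstration only.

```python
# A minimal sketch of MCMC calibration of a rainfall-runoff model against
# observed daily runoff. A two-parameter linear-reservoir toy model stands in
# for NWS-PC, whose structure and 13 parameters are not reproduced here.
import numpy as np

rng = np.random.default_rng(0)

def simulate_runoff(rain, k, c):
    """Linear reservoir: a fraction c of rain enters storage, which drains at rate k."""
    storage, q = 0.0, np.empty_like(rain)
    for t, r in enumerate(rain):
        storage += c * r
        q[t] = k * storage
        storage -= q[t]
    return q

rain = rng.gamma(shape=0.5, scale=10.0, size=365)                 # synthetic daily rainfall
obs = simulate_runoff(rain, 0.3, 0.6) + rng.normal(0, 0.5, 365)   # "observed" runoff

def log_posterior(params):
    k, c = params
    if not (0 < k < 1 and 0 < c < 1):                             # uniform(0, 1) priors
        return -np.inf
    resid = obs - simulate_runoff(rain, k, c)
    return -0.5 * np.sum(resid**2) / 0.5**2                       # Gaussian error model

theta = np.array([0.5, 0.5])
logp = log_posterior(theta)
chain = []
for _ in range(10000):
    prop = theta + 0.02 * rng.standard_normal(2)                  # random-walk proposal
    lp = log_posterior(prop)
    if np.log(rng.random()) < lp - logp:
        theta, logp = prop, lp
    chain.append(theta.copy())
chain = np.array(chain[2000:])                                    # drop burn-in
print("posterior mean (k, c):", chain.mean(axis=0))
print("95% intervals:", np.percentile(chain, [2.5, 97.5], axis=0))
```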

Bayesian estimation of kinematic parameters of disk galaxies in large HI galaxy surveys

  • Oh, Se-Heon; Staveley-Smith, Lister
    • The Bulletin of The Korean Astronomical Society, v.41 no.2, pp.62.2-62.2, 2016
  • We present a newly developed algorithm, based on a Bayesian method, for 2D tilted-ring analysis of disk galaxies which operates on velocity fields. Compared to conventional algorithms based on chi-squared minimisation, this Bayesian algorithm suffers less from local minima of the model parameters, even when their posterior distributions are highly multi-modal. Moreover, the Bayesian analysis implemented via Markov Chain Monte Carlo (MCMC) sampling requires only broad ranges for the posterior distributions of the parameters, which makes the fitting procedure fully automated. This feature is essential for performing kinematic analysis of the unprecedented number of resolved galaxies expected from the upcoming Square Kilometre Array (SKA) pathfinder galaxy surveys. A standalone code, the '2D Bayesian Automated Tilted-ring fitter' (2DBAT), which implements the Bayesian fitting of 2D tilted-ring models, is developed for deriving rotation curves of galaxies that are at least marginally resolved (> 3 beams across the semi-major axis) and moderately inclined (20 < i < 70 degrees). The main layout of 2DBAT and its performance tests are discussed using sample galaxies from Australia Telescope Compact Array (ATCA) observations as well as artificial data cubes built from representative rotation curves of intermediate-mass and massive spiral galaxies.
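A heavily simplified, single-ring version of the velocity-field fitting idea can be written as follows. The disk geometry is the standard tilted-ring projection, but the mock data, priors, noise level, and random-walk Metropolis sampler are illustrative stand-ins and do not represent 2DBAT's actual machinery.

```python
# A minimal sketch of a Bayesian fit of a single flat-disk "ring" model to a
# 2D velocity field; the full tilted-ring treatment (ring-by-ring geometry,
# multi-modal posteriors) is far more elaborate than this.
import numpy as np

rng = np.random.default_rng(1)

def disk_vlos(x, y, vsys, vrot, inc, pa):
    """Line-of-sight velocity of a thin rotating disk (angles in radians)."""
    xr = -x * np.sin(pa) + y * np.cos(pa)                 # along the major axis
    yr = -(x * np.cos(pa) + y * np.sin(pa)) / np.cos(inc)
    r = np.hypot(xr, yr)
    cos_theta = np.divide(xr, r, out=np.zeros_like(r), where=r > 0)
    return vsys + vrot * cos_theta * np.sin(inc)

# Mock velocity field with noise
x, y = np.meshgrid(np.linspace(-20, 20, 41), np.linspace(-20, 20, 41))
truth = dict(vsys=500.0, vrot=180.0, inc=np.radians(50), pa=np.radians(30))
vfield = disk_vlos(x, y, **truth) + rng.normal(0, 5.0, x.shape)

def log_posterior(p):
    vsys, vrot, inc, pa = p
    if not (0 < vrot < 400 and np.radians(20) < inc < np.radians(70)):
        return -np.inf                                    # broad, flat priors
    model = disk_vlos(x, y, vsys, vrot, inc, pa)
    return -0.5 * np.sum((vfield - model) ** 2) / 5.0**2

theta = np.array([490.0, 150.0, np.radians(45), np.radians(25)])
logp = log_posterior(theta)
chain = []
for _ in range(20000):                                    # random-walk Metropolis
    prop = theta + np.array([1.0, 2.0, 0.01, 0.01]) * rng.standard_normal(4)
    lp = log_posterior(prop)
    if np.log(rng.random()) < lp - logp:
        theta, logp = prop, lp
    chain.append(theta.copy())
chain = np.array(chain[5000:])
print("posterior medians [vsys, vrot, inc(rad), pa(rad)]:", np.median(chain, axis=0))
```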


Comparative analysis of Bayesian and maximum likelihood estimators in change point problems with Poisson process

  • Kitabo, Cheru Atsmegiorgis; Kim, Jong Tae
    • Journal of the Korean Data and Information Science Society, v.26 no.1, pp.261-269, 2015
  • Change point analysis has become indispensable in a wide range of areas such as quality control, finance, environmetrics, medicine, geography, and engineering. Identifying the times at which a process changes helps minimize the consequences that follow. The main objective of this paper is to compare the change-point detection capabilities of the Bayesian estimator and the maximum likelihood estimator. We applied Bayesian and maximum likelihood techniques to change point problems involving a single step change and multiple change points in a Poisson rate. After a signal from the c-chart and Poisson cumulative sum control charts has been detected, Monte Carlo simulation is applied to investigate the performance of the Bayesian and maximum likelihood estimators. The change point detection capacities of the two techniques were investigated through simulation. It was found that the Bayesian estimator outperforms standard control charts, especially when the step change is of small to medium size. Moreover, it compares favorably with the maximum likelihood estimator and remains a good choice, particularly for confidence interval inference.
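The single-step-change case can be sketched directly: with a conjugate Gamma prior on each Poisson rate, the posterior over the change point has a closed form, and the maximum likelihood estimate comes from the profile log-likelihood. The simulated rates, sample size, and prior hyperparameters below are illustrative choices, not the paper's settings.

```python
# A minimal sketch comparing maximum likelihood and Bayesian (conjugate Gamma
# prior) estimation of a single change point in a Poisson rate.
import numpy as np
from scipy.special import gammaln, xlogy

rng = np.random.default_rng(7)
n, true_tau = 100, 60
counts = np.concatenate([rng.poisson(4.0, true_tau),        # rate before the change
                         rng.poisson(7.0, n - true_tau)])   # rate after the change

taus = np.arange(1, n)                                      # candidate change points
S1 = np.cumsum(counts)[:-1]                                 # counts up to tau
S2 = counts.sum() - S1                                      # counts after tau

# Maximum likelihood: profile log-likelihood over tau (factorial terms cancel)
lam1, lam2 = S1 / taus, S2 / (n - taus)
loglik = xlogy(S1, lam1) - taus * lam1 + xlogy(S2, lam2) - (n - taus) * lam2
tau_mle = taus[np.argmax(loglik)]

# Bayesian: Gamma(a, b) priors on both rates, uniform prior on tau; the
# marginal likelihood of each tau is then available in closed form.
a, b = 1.0, 0.2
log_post = (gammaln(a + S1) - (a + S1) * np.log(b + taus)
            + gammaln(a + S2) - (a + S2) * np.log(b + n - taus))
post = np.exp(log_post - log_post.max())
post /= post.sum()
tau_bayes = taus[np.argmax(post)]                           # posterior mode

print(f"true change point: {true_tau}, MLE: {tau_mle}, Bayes mode: {tau_bayes}")
print(f"posterior mean of tau: {np.sum(taus * post):.1f}")
```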

Uncertainty Analysis for Parameters of Probability Distribution in Rainfall Frequency Analysis by Bayesian MCMC and Metropolis Hastings Algorithm (Bayesian MCMC 및 Metropolis Hastings 알고리즘을 이용한 강우빈도분석에서 확률분포의 매개변수에 대한 불확실성 해석)

  • Seo, Young-Min; Park, Ki-Bum
    • Journal of Environmental Science International, v.20 no.3, pp.329-340, 2011
  • Rainfall and flood frequency analysis in water resources planning is usually based on the frequentist viewpoint, which defines probability as the limit of a relative frequency and treats the unknown parameters of the probability model as fixed constants. The probability is thus regarded as objective and the parameters as having fixed values, so it is very difficult to describe the uncertainty of these parameters probabilistically. This study constructs an uncertainty evaluation model using Bayesian MCMC and the Metropolis-Hastings algorithm for quantifying the uncertainty of the parameters of the probability distribution in rainfall frequency analysis. By applying Bayesian MCMC with the Metropolis-Hastings algorithm, the statistical properties and uncertainty intervals of the distribution parameters can be quantified in the estimation of probability rainfall, providing a basis for a framework that can account for uncertainty and risk in flood risk assessment and decision-making.

Application of Bayesian Networks for Flood Risk Analysis (베이지안 네트워크를 적용한 홍수 위험도 분석)

  • SunWoo, Woo-Yeon; Lee, Kil-Seong; Chung, Eun-Sung
    • Proceedings of the Korea Water Resources Association Conference, 2012.05a, pp.467-467, 2012
  • As recent floods have become spatially concentrated, losses of life and property have increased under the impact of climate change. In addition, public interest in water control information has grown, and a socially reasonable justification of water control policy is needed. Estimating flood risk is necessary in order to inform the public of the status of flood control and to establish flood control policy. For accurate flood risk analysis, the inter-relations between the causal factors of flood damage should be considered; that is, flood risk analysis should account for the interdependence of the selected factors. Bayesian networks are ideally suited to assisting decision-making in situations where the data are uncertain and the variables are highly interlinked. In this research, to provide more appropriate water control information, flood risk analysis is performed using Bayesian networks to handle the uncertainty and dependency among 13 proxy variables.
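A toy Bayesian network conveys the mechanics the study relies on; the three nodes, their conditional probability tables, and the enumeration-based inference below are invented for illustration and do not reflect the 13 proxy variables or the probabilities used in the paper.

```python
# A toy three-node Bayesian network (Rainfall -> Flood <- Drainage) with
# inference by enumeration; the numbers are invented purely to show the mechanics.
import itertools

P_rain = {"heavy": 0.2, "normal": 0.8}
P_drain = {"poor": 0.3, "good": 0.7}
P_flood = {  # P(Flood = yes | Rainfall, Drainage)
    ("heavy", "poor"): 0.80, ("heavy", "good"): 0.35,
    ("normal", "poor"): 0.15, ("normal", "good"): 0.02,
}

def joint(rain, drain, flood):
    p_yes = P_flood[(rain, drain)]
    return P_rain[rain] * P_drain[drain] * (p_yes if flood == "yes" else 1 - p_yes)

def query_flood(evidence):
    """P(Flood = yes | evidence), summing the joint over the remaining variables."""
    num = den = 0.0
    for rain, drain, flood in itertools.product(P_rain, P_drain, ("yes", "no")):
        assignment = {"rain": rain, "drain": drain, "flood": flood}
        if any(assignment[k] != v for k, v in evidence.items()):
            continue
        p = joint(rain, drain, flood)
        den += p
        if flood == "yes":
            num += p
    return num / den

print("P(flood) =", round(query_flood({}), 3))
print("P(flood | heavy rain) =", round(query_flood({"rain": "heavy"}), 3))
print("P(flood | heavy rain, poor drainage) =",
      round(query_flood({"rain": "heavy", "drain": "poor"}), 3))
```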


Bayesian Reliability Analysis Using Kriging Dimension Reduction Method (KDRM) (크리깅 기반 차원감소법을 이용한 베이지안 신뢰도 해석)

  • An, Da-Wn; Choi, Joo-Ho; Won, Jun-Ho
    • Proceedings of the Computational Structural Engineering Institute Conference, 2008.04a, pp.602-607, 2008
  • A technique for reliability-based design optimization (RBDO) is developed based on the Bayesian approach, which can deal with the epistemic uncertainty arising from a limited number of data. Until recently, conventional RBDO was implemented mostly by assuming the uncertainty to be aleatory, meaning that the statistical properties are completely known. In practice, however, this is not the case because the data are insufficient to estimate the statistical information, which makes the existing RBDO methods less useful. In this study, a Bayesian reliability is introduced to account for the epistemic uncertainty; it is defined as the lower confidence bound of the probability distribution of the original reliability. In this case, the Bayesian reliability requires a double loop of conventional reliability analyses, which can be computationally expensive. The Kriging-based dimension reduction method (KDRM), a new efficient tool for reliability analysis, is employed to this end. The proposed method is illustrated using a couple of numerical examples.
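The notion of a Bayesian reliability as a lower confidence bound can be sketched for a hypothetical linear limit state with a normally distributed input whose mean and variance are known only from a small sample. The closed-form inner reliability below stands in for the KDRM-based analysis; the sample size, limit state, and noninformative prior are assumptions for illustration.

```python
# A minimal sketch of the "Bayesian reliability" idea: with only a small sample
# of the random input X, the reliability P(g(X) > 0) is itself uncertain, and a
# lower confidence bound of its distribution is taken as the Bayesian reliability.
# The inner reliability analysis is closed form for g(x) = 60 - x with X normal.
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
data = rng.normal(50.0, 4.0, size=15)            # small sample of the input X
n, xbar, s2 = len(data), data.mean(), data.var(ddof=1)

# Posterior draws of (mu, sigma^2) under the noninformative prior 1/sigma^2
m = 20000
sigma2 = (n - 1) * s2 / rng.chisquare(n - 1, size=m)
mu = xbar + np.sqrt(sigma2 / n) * rng.standard_normal(m)

# Outer loop over epistemic draws; the inner "reliability analysis" is the
# closed-form normal reliability for the linear limit state g(x) = 60 - x.
reliability = stats.norm.cdf((60.0 - mu) / np.sqrt(sigma2))

bayes_rel = np.percentile(reliability, 5)        # 5% lower confidence bound
print(f"mean reliability:     {reliability.mean():.4f}")
print(f"Bayesian reliability: {bayes_rel:.4f} (5th percentile)")
```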


A Comparative Study on the Performance of Bayesian Partially Linear Models

  • Woo, Yoonsung; Choi, Taeryon; Kim, Wooseok
    • Communications for Statistical Applications and Methods, v.19 no.6, pp.885-898, 2012
  • In this paper, we consider Bayesian approaches to partially linear models, in which a regression function is represented by a semiparametric additive form of a parametric linear regression function and a nonparametric regression function. We make a comparative study of the performance of widely used Bayesian partially linear models in terms of empirical analysis. Specifically, we deal with three Bayesian methods for estimating the nonparametric regression function: one using a Fourier series representation, another based on Gaussian process regression, and a third based on the smoothness of the function and differencing. We compare the numerical performance of the three methods by the root mean squared error (RMSE). For the empirical analysis, we consider synthetic data in simulation studies as well as a real data application, fitting each with the three Bayesian methods and comparing the RMSEs.
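A minimal version of the Fourier-series approach mentioned above is sketched below: the nonparametric part is expanded in a truncated Fourier basis, Gaussian priors on all coefficients give a closed-form posterior mean, and an RMSE is computed as in the comparison. The test function, basis size, and prior variances are illustrative assumptions, not the paper's settings.

```python
# A minimal sketch of a Bayesian partially linear fit: y = x*beta + f(z) + e,
# with f(z) expanded in a truncated Fourier basis and Gaussian priors on all
# coefficients, so the posterior mean is available in closed form.
import numpy as np

rng = np.random.default_rng(3)
n = 200
x = rng.normal(size=(n, 2))                       # parametric covariates
z = rng.uniform(0, 1, n)                          # nonparametric covariate
f_true = np.sin(2 * np.pi * z) + 0.5 * np.cos(4 * np.pi * z)
y = x @ np.array([1.5, -2.0]) + f_true + rng.normal(0, 0.3, n)

def fourier_basis(z, K=6):
    cols = [np.ones_like(z)]
    for k in range(1, K + 1):
        cols += [np.cos(2 * np.pi * k * z), np.sin(2 * np.pi * k * z)]
    return np.column_stack(cols)

D = np.column_stack([x, fourier_basis(z)])        # design: linear part + basis
sigma2, tau2 = 0.3**2, 10.0                       # noise variance, prior variance
# Conjugate Gaussian posterior mean: (D'D + (sigma2/tau2) I)^-1 D'y
A = D.T @ D + (sigma2 / tau2) * np.eye(D.shape[1])
post_mean = np.linalg.solve(A, D.T @ y)

fitted = D @ post_mean
rmse = np.sqrt(np.mean((y - fitted) ** 2))
print("posterior mean of linear coefficients:", post_mean[:2])
print(f"in-sample RMSE: {rmse:.3f}")
```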

Bayesian Statistical Modeling of System Energy Saving Effectiveness for MAC Protocols of Wireless Sensor Networks: The Case of Non-Informative Prior Knowledge

  • Kim, Myong-Hee; Park, Man-Gon
    • Journal of Korea Multimedia Society, v.13 no.6, pp.890-900, 2010
  • Bayesian methods provide an efficient tool for performing information fusion and decision making under conditions of uncertainty. This paper proposes Bayes estimators of the system energy saving effectiveness of wireless sensor networks under non-informative prior knowledge about the mean active and sleep times, based on the time frames of sensor nodes. We then conduct a case study on several Bayesian estimation models for the system energy saving effectiveness of a wireless sensor network, and evaluate and compare the performance of the proposed Bayes estimators. The case study shows that the proposed estimators of system energy saving effectiveness are well suited to evaluating energy efficiency using non-informative prior knowledge drawn from previous experience, and are robust over the given parameter values.
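One way to sketch a Bayes estimator under non-informative prior knowledge is to treat active and sleep durations as exponential with Jeffreys priors on their means. The definition of effectiveness used below (expected fraction of a time frame spent sleeping) and the synthetic data are assumptions for illustration, not necessarily the paper's model.

```python
# A minimal sketch of Bayes estimation under a non-informative prior: exponential
# active/sleep times with the Jeffreys prior p(theta) ~ 1/theta on each mean, and
# an illustrative "energy saving effectiveness" = expected fraction of time asleep.
import numpy as np

rng = np.random.default_rng(11)
active = rng.exponential(scale=2.0, size=30)    # observed active times (ms)
sleep = rng.exponential(scale=8.0, size=30)     # observed sleep times (ms)

def posterior_mean_draws(x, m):
    """Posterior draws of an exponential mean under the Jeffreys prior 1/theta;
    the rate 1/theta then has a Gamma(n, sum(x)) posterior."""
    n, s = len(x), x.sum()
    rate = rng.gamma(shape=n, scale=1.0 / s, size=m)
    return 1.0 / rate

m = 50000
theta_a = posterior_mean_draws(active, m)
theta_s = posterior_mean_draws(sleep, m)
effectiveness = theta_s / (theta_a + theta_s)   # fraction of each frame spent asleep

print(f"Bayes estimate (posterior mean): {effectiveness.mean():.3f}")
lo, hi = np.percentile(effectiveness, [2.5, 97.5])
print(f"95% credible interval: ({lo:.3f}, {hi:.3f})")
```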

Complex Segregation Analysis of Categorical Traits in Farm Animals: Comparison of Linear and Threshold Models

  • Kadarmideen, Haja N.; Ilahi, H.
    • Asian-Australasian Journal of Animal Sciences, v.18 no.8, pp.1088-1097, 2005
  • The main objectives of this study were to investigate the accuracy, bias, and power of linear and threshold model segregation analysis methods for the detection of major genes affecting categorical traits in farm animals. The Maximum Likelihood Linear Model (MLLM), Bayesian Linear Model (BALM), and Bayesian Threshold Model (BATM) were applied to simulated data on normal, categorical, and binary scales, as well as to disease data in pigs. Simulated data on the underlying normally distributed liability (NDL) were used to create the categorical and binary data. The MLLM method was applied to data on all scales (normal, categorical, and binary), while the BATM method was developed and applied only to binary data. The MLLM analyses underestimated parameters for binary as well as categorical traits compared with normal traits, the bias being very severe for binary traits. The accuracy of major gene and polygene parameter estimates was also much lower for binary data than for categorical data; the latter gave results similar to normal data. When disease incidence (on the binary scale) is close to 50%, segregation analysis is more accurate and less biased than for rare diseases. NDL data were always better than categorical data. Under the MLLM method, the test statistics for categorical and binary data were consistently and unusually high (whereas the opposite is expected because of the loss of information in categorical data), indicating high false discovery rates for major genes if linear models are applied to categorical traits. With Bayesian segregation analysis, the 95% highest probability density regions of the major gene variances were checked for whether they included zero (a boundary parameter); because of this difference between the likelihood and Bayesian approaches, the Bayesian methods are likely to be more reliable for categorical data. The BATM segregation analysis of binary data also showed a significant advantage over MLLM in terms of higher accuracy. Based on these results, threshold models are recommended when the trait distributions are discontinuous. Furthermore, segregation analysis could be used in an initial scan of the data for evidence of major genes before embarking on molecular genome mapping.
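The linear-versus-threshold contrast can be illustrated with a small simulation: a binary trait generated from a normal liability is fitted with an ordinary linear model and with a probit (threshold) model, showing the attenuation the abstract describes. The single major-gene effect, allele frequency, and sample size below are invented, and the Bayesian segregation machinery itself (polygenes, MCMC) is not reproduced.

```python
# A minimal sketch of why a threshold model is preferred for binary traits: the
# binary phenotype comes from an underlying normal liability, then is fitted with
# an ordinary linear model and with a probit (threshold) model.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2024)
n = 2000
genotype = rng.binomial(2, 0.3, size=n)            # major-gene genotype (0/1/2)
liability = 0.8 * genotype + rng.normal(0, 1, n)   # normally distributed liability
y = (liability > 1.0).astype(float)                # binary trait via a threshold

X = sm.add_constant(genotype.astype(float))
linear = sm.OLS(y, X).fit()                        # linear model on the 0/1 trait
threshold = sm.Probit(y, X).fit(disp=0)            # threshold (probit) model

print("true liability-scale effect: 0.80")
print(f"linear model estimate:    {linear.params[1]:.3f}  (attenuated, scale-dependent)")
print(f"threshold model estimate: {threshold.params[1]:.3f}  (liability scale)")
```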