• Title/Summary/Keyword: Gibbs algorithm

Inverted exponentiated Weibull distribution with applications to lifetime data

  • Lee, Seunghyung;Noh, Yunhwan;Chung, Younshik
    • Communications for Statistical Applications and Methods
    • /
    • v.24 no.3
    • /
    • pp.227-240
    • /
    • 2017
  • In this paper, we introduce the inverted exponentiated Weibull (IEW) distribution, which contains the exponentiated inverted Weibull distribution, the inverse Weibull (IW) distribution, and the inverted exponentiated distribution as submodels. The proposed distribution is obtained as the inverse form of the exponentiated Weibull distribution. In particular, we explain that the proposed distribution can be interpreted through the idea in Marshall and Olkin's book (Life Distributions: Structure of Nonparametric, Semiparametric, and Parametric Families, 2007, Springer). We derive the cumulative distribution function and hazard function and obtain expressions for the moments. The hazard function of the IEW distribution can be decreasing, increasing, or bathtub-shaped. The maximum likelihood estimator (MLE) is obtained, and its existence and uniqueness are shown. Bayesian estimates can also be obtained by using the Gibbs sampler with the Metropolis-Hastings algorithm. Applications to a simulated data set and two real data sets show the flexibility of the IEW distribution. Finally, conclusions are presented.
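
A minimal numerical sketch of the distribution just described, assuming the IEW variable arises as X = 1/Y with Y exponentiated Weibull; the (alpha, beta, lam) parameterization and the specific CDF form below are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def iew_cdf(x, alpha, beta, lam):
    """CDF of X = 1/Y with Y exponentiated Weibull:
    F(x) = 1 - (1 - exp(-(lam/x)**beta))**alpha  (assumed parameterization)."""
    x = np.asarray(x, dtype=float)
    return 1.0 - (1.0 - np.exp(-(lam / x) ** beta)) ** alpha

def iew_pdf(x, alpha, beta, lam):
    """Density obtained by differentiating the CDF above."""
    x = np.asarray(x, dtype=float)
    z = np.exp(-(lam / x) ** beta)
    return alpha * beta * lam ** beta * x ** (-beta - 1) * z * (1.0 - z) ** (alpha - 1)

def iew_hazard(x, alpha, beta, lam):
    """Hazard h(x) = f(x) / (1 - F(x)); its shape varies with (alpha, beta)."""
    return iew_pdf(x, alpha, beta, lam) / (1.0 - iew_cdf(x, alpha, beta, lam))

x = np.linspace(0.1, 5.0, 50)
print(iew_hazard(x, alpha=2.0, beta=0.8, lam=1.0)[:5])
```

Evaluating the hazard over a grid for different (alpha, beta) pairs is the natural way to see the decreasing, increasing, and bathtub shapes the abstract mentions.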

Bayesian Change Point Analysis for a Sequence of Normal Observations: Application to the Winter Average Temperature in Seoul (정규확률변수 관측치열에 대한 베이지안 변화점 분석 : 서울지역 겨울철 평균기온 자료에의 적용)

  • 김경숙;손영숙
    • The Korean Journal of Applied Statistics
    • /
    • v.17 no.2
    • /
    • pp.281-301
    • /
    • 2004
  • In this paper we consider the change point problem for a sequence of univariate normal observations. We want to know whether any change point exists; if one does, we identify its type, namely a mean change, a variance change, or a change in both the mean and the variance. The intrinsic Bayes factors of Berger and Pericchi (1996, 1998) are used to find the optimal type of change model. Gibbs sampling, including Metropolis-Hastings steps, is used to estimate all the parameters of the change model. These methods are checked via simulation and applied to the winter average temperature data for Seoul.
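
A minimal sketch of the kind of Gibbs sampler the abstract describes, assuming a single mean change with known variance and flat priors; the full method also selects the change type via intrinsic Bayes factors and uses Metropolis-Hastings steps for non-conjugate parameters, both omitted here:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data with a mean shift at t = 60 (simulated, not the Seoul series)
y = np.concatenate([rng.normal(0.0, 1.0, 60), rng.normal(1.5, 1.0, 40)])
sigma2 = 1.0  # variance treated as known to keep the sketch short

def gibbs_change_point(y, n_iter=2000):
    n = len(y)
    k = n // 2
    draws = np.zeros(n_iter, dtype=int)
    for it in range(n_iter):
        # mu1 | k, y  and  mu2 | k, y : flat priors give normal full conditionals
        mu1 = rng.normal(y[:k].mean(), np.sqrt(sigma2 / k))
        mu2 = rng.normal(y[k:].mean(), np.sqrt(sigma2 / (n - k)))
        # k | mu1, mu2, y : discrete full conditional over 1..n-1
        logp = np.array([
            -0.5 / sigma2 * (np.sum((y[:j] - mu1) ** 2) + np.sum((y[j:] - mu2) ** 2))
            for j in range(1, n)
        ])
        p = np.exp(logp - logp.max())
        k = rng.choice(np.arange(1, n), p=p / p.sum())
        draws[it] = k
    return draws

draws = gibbs_change_point(y)
print("posterior mode of change point:", np.bincount(draws).argmax())
```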

An Improved 2-D Moment Algorithm for Pattern Classification

  • Yoon, Myoung-Young
    • Journal of Korea Society of Industrial Information Systems
    • /
    • v.4 no.2
    • /
    • pp.1-6
    • /
    • 1999
  • We propose a new algorithm for pattern classification that extracts feature vectors based on Gibbs distributions, which are well suited to representing the characteristics of an image. The extracted feature vectors consist of 2-D moments that are invariant under translation, rotation, and scaling of the image and are less sensitive to noise. The implementation contains two parts: feature extraction and pattern classification. First, we extract a feature vector consisting of improved 2-D moments on the basis of an estimated Gibbs distribution. Next, in the classification phase, minimizing the discrimination cost function for a specific pattern determines the corresponding template pattern. To evaluate the performance of the proposed scheme, classification experiments with training document sets of characters were carried out on a SUN ULTRA 10 workstation. The experimental results reveal that the proposed scheme achieves a high classification rate of over 98%.
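
The paper's features combine an estimated Gibbs distribution with improved 2-D moments; as a hedged stand-in, the sketch below computes classical translation/rotation/scale-invariant moment features (Hu's first two invariants), illustrating only the invariant-moment part:

```python
import numpy as np

def central_moment(img, p, q):
    """Central moment mu_pq of a 2-D intensity image (translation invariant)."""
    ys, xs = np.mgrid[:img.shape[0], :img.shape[1]]
    m00 = img.sum()
    xbar, ybar = (xs * img).sum() / m00, (ys * img).sum() / m00
    return ((xs - xbar) ** p * (ys - ybar) ** q * img).sum()

def normalized_moment(img, p, q):
    """Scale-normalized central moment eta_pq = mu_pq / mu_00^(1+(p+q)/2)."""
    return central_moment(img, p, q) / central_moment(img, 0, 0) ** (1 + (p + q) / 2.0)

def hu_invariants(img):
    """First two of Hu's rotation-invariant moments, one classical way to get
    translation/rotation/scale-invariant 2-D moment features."""
    n20, n02, n11 = (normalized_moment(img, *pq) for pq in [(2, 0), (0, 2), (1, 1)])
    return np.array([n20 + n02, (n20 - n02) ** 2 + 4 * n11 ** 2])

img = np.zeros((32, 32)); img[8:24, 12:20] = 1.0  # toy binary pattern
print(hu_invariants(img))
```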

Performance of Image Reconstruction Techniques for Efficient Multimedia Transmission of Multi-Copter (멀티콥터의 효율적 멀티미디어 전송을 위한 이미지 복원 기법의 성능)

  • Hwang, Yu Min;Lee, Sun Yui;Lee, Sang Woon;Kim, Jin Young
    • Journal of Satellite, Information and Communications
    • /
    • v.9 no.4
    • /
    • pp.104-110
    • /
    • 2014
  • This paper considers two reconstruction schemes for structured-sparse signals, turbo inference and Markov chain Monte Carlo (MCMC) inference, within the compressed sensing (CS) framework, which has recently become an important issue for efficient wireless video transmission from a multi-copter used as an unmanned aerial vehicle. The proposed reconstruction algorithms emphasize reduced image data sizes, fast reconstruction, and error-free reconstruction. Experiments with twenty kinds of images show that the turbo reconstruction algorithm based on loopy belief propagation (BP) outperforms the MCMC algorithm based on Gibbs sampling in terms of average reconstruction time and normalized mean squared error (NMSE).
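
A small harness, under assumed interfaces, for the comparison the abstract reports: it measures average reconstruction time and NMSE for any CS reconstructor. The turbo-BP and Gibbs-sampling MCMC solvers themselves are beyond a short sketch, so a naive pseudo-inverse baseline stands in:

```python
import time
import numpy as np

def nmse(x_true, x_hat):
    """Normalized mean squared error: ||x - x_hat||^2 / ||x||^2."""
    x_true, x_hat = np.ravel(x_true), np.ravel(x_hat)
    return float(np.sum((x_true - x_hat) ** 2) / np.sum(x_true ** 2))

def benchmark(reconstruct, images, measure):
    """Average reconstruction time and average NMSE of `reconstruct`
    over a set of images; `measure` maps an image to its CS measurements."""
    times, errors = [], []
    for img in images:
        y = measure(img)
        t0 = time.perf_counter()
        img_hat = reconstruct(y)
        times.append(time.perf_counter() - t0)
        errors.append(nmse(img, img_hat))
    return np.mean(times), np.mean(errors)

rng = np.random.default_rng(1)
A = rng.normal(size=(400, 1024)) / np.sqrt(400)   # random Gaussian sensing matrix
A_pinv = np.linalg.pinv(A)                        # precomputed naive baseline
measure = lambda img: A @ img.ravel()
baseline = lambda y: (A_pinv @ y).reshape(32, 32)  # stands in for turbo-BP / MCMC
images = [rng.normal(size=(32, 32)) for _ in range(5)]
print(benchmark(baseline, images, measure))
```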

Bayesian analysis of random partition models with Laplace distribution

  • Kyung, Minjung
    • Communications for Statistical Applications and Methods
    • /
    • v.24 no.5
    • /
    • pp.457-480
    • /
    • 2017
  • We develop a random partition procedure based on a Dirichlet process prior with a Laplace distribution. Gibbs sampling of a Laplace mixture of linear mixed regressions with a Dirichlet process is implemented as a random partition model when the number of clusters is unknown. Our approach provides simultaneous partitioning and parameter estimation with the computation of classification probabilities, unlike its counterparts. A full Gibbs-sampling algorithm is developed for efficient Markov chain Monte Carlo posterior computation. The proposed method is illustrated with simulated data and a real data set on energy efficiency from Tsanas and Xifara (Energy and Buildings, 49, 560-567, 2012).
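
A deliberately crude sketch of the Dirichlet-process flavor of such a sampler: collapsed-Gibbs label updates with Chinese-restaurant-process weights and a Laplace likelihood using plug-in cluster locations. The paper's actual sampler handles linear mixed regression structure and samples cluster parameters properly; everything numeric below is an assumption:

```python
import numpy as np

rng = np.random.default_rng(0)

def laplace_logpdf(x, loc, scale):
    return -abs(x - loc) / scale - np.log(2.0 * scale)

y = np.concatenate([rng.laplace(-3, 1, 40), rng.laplace(3, 1, 40)])
alpha, b = 1.0, 1.0           # DP concentration and (fixed) Laplace scale
z = np.zeros(len(y), dtype=int)  # cluster labels
idx = np.arange(len(y))
for sweep in range(30):
    for i in range(len(y)):
        mask = idx != i
        labels, counts = np.unique(z[mask], return_counts=True)
        # existing cluster k: weight n_k * Laplace(y_i | plug-in cluster median)
        logw = [np.log(c) + laplace_logpdf(y[i], np.median(y[mask][z[mask] == k]), b)
                for k, c in zip(labels, counts)]
        # new cluster: weight alpha * a diffuse base density (crude stand-in)
        logw.append(np.log(alpha) + laplace_logpdf(y[i], 0.0, 3.0))
        w = np.exp(np.array(logw) - max(logw))
        j = rng.choice(len(w), p=w / w.sum())
        z[i] = labels[j] if j < len(labels) else z.max() + 1
print("clusters found:", len(np.unique(z)))
```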

DEFAULT BAYESIAN INFERENCE OF REGRESSION MODELS WITH ARMA ERRORS UNDER EXACT FULL LIKELIHOODS

  • Son, Young-Sook
    • Journal of the Korean Statistical Society
    • /
    • v.33 no.2
    • /
    • pp.169-189
    • /
    • 2004
  • Under the assumption of default priors, such as noninformative priors, Bayesian model determination and parameter estimation of regression models with stationary and invertible ARMA errors are developed under exact full likelihoods. The default Bayes factors, the fractional Bayes factor (FBF) of O'Hagan (1995) and the arithmetic intrinsic Bayes factor (AIBF) of Berger and Pericchi (1996a), are used as tools for Bayesian model selection. Bayesian estimates are obtained by running Metropolis-Hastings subchains within the Gibbs sampler. Finally, the results of numerical studies, designed to check the performance of the theoretical results discussed here, are presented.
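
A minimal sketch of the "Metropolis-Hastings subchain inside a Gibbs sweep" device the abstract mentions, for a block whose full conditional is known only up to a constant; the toy target below is a standard normal, not the paper's ARMA posterior:

```python
import numpy as np

rng = np.random.default_rng(0)

def mh_subchain(theta, log_cond, step=0.2, n_steps=5):
    """Short Metropolis-Hastings subchain used within a Gibbs sweep for a
    block whose full conditional `log_cond` lacks a closed-form sampler."""
    lp = log_cond(theta)
    for _ in range(n_steps):
        prop = theta + rng.normal(0.0, step, size=np.shape(theta))
        lp_prop = log_cond(prop)
        if np.log(rng.uniform()) < lp_prop - lp:  # accept/reject
            theta, lp = prop, lp_prop
    return theta

# Toy use: the subchain targets a standard normal full conditional
draws, th = [], 0.0
for it in range(2000):
    th = mh_subchain(th, lambda t: -0.5 * t ** 2)
    draws.append(th)
print(np.mean(draws), np.std(draws))  # should be near 0 and 1
```

In the paper's setting the same update would be embedded among exact Gibbs draws for the conjugate blocks, with `log_cond` the conditional log-likelihood of the ARMA parameters.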

Bayesian Approach for Determining the Order p in Autoregressive Models

  • Kim, Chansoo;Chung, Younshik
    • Communications for Statistical Applications and Methods
    • /
    • v.8 no.3
    • /
    • pp.777-786
    • /
    • 2001
  • Autoregressive models have been used to describe a wide variety of time series, so the problem of determining the order of a time series model is very important in data analysis. We consider a Bayesian approach to finding the order of autoregressive (AR) error models using latent variables, motivated by Tanner and Wong (1987). The latent variables are combined with the coefficient parameters, and sequential steps are proposed to set up the prior of the latent variables. Markov chain Monte Carlo methods (the Gibbs sampler and the Metropolis-Hastings algorithm) are used to overcome the difficulties of the Bayesian computations. Three examples, including an AR(3) error model, are presented to illustrate the proposed methodology.
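
The order-determination task itself can be illustrated without the paper's latent-variable MCMC; the sketch below fits AR(p) by conditional least squares and picks the order with BIC, a crude frequentist stand-in for the Bayesian posterior over orders:

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy AR(3) series: y_t = 0.5 y_{t-1} - 0.3 y_{t-2} + 0.2 y_{t-3} + e_t
n = 500
y = np.zeros(n)
for t in range(3, n):
    y[t] = 0.5 * y[t-1] - 0.3 * y[t-2] + 0.2 * y[t-3] + rng.normal()

def ar_bic(y, p):
    """Conditional least-squares fit of AR(p) and the resulting BIC."""
    n = len(y)
    X = np.column_stack([y[p - k:n - k] for k in range(1, p + 1)])  # lag matrix
    target = y[p:]
    beta = np.linalg.lstsq(X, target, rcond=None)[0]
    sigma2 = np.mean((target - X @ beta) ** 2)
    return (n - p) * np.log(sigma2) + p * np.log(n - p)

bics = {p: ar_bic(y, p) for p in range(1, 7)}
print("selected order:", min(bics, key=bics.get))  # expect 3 for this toy series
```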

Bayesian Inference for Switching Mean Models with ARMA Errors

  • Son, Young Sook;Kim, Seong W.;Cho, Sinsup
    • Communications for Statistical Applications and Methods
    • /
    • v.10 no.3
    • /
    • pp.981-996
    • /
    • 2003
  • Bayesian inference is considered for switching mean models with ARMA errors. We use noninformative improper priors or uniform priors. The fractional Bayes factor of O'Hagan (1995) is used as the Bayesian tool for detecting the existence of a single change or multiple changes, and the usual Bayes factor is used for identifying the orders of the ARMA errors. Once the model is fully identified, a Gibbs sampler with Metropolis-Hastings subchains is constructed to estimate the parameters. Finally, we perform a simulation study to support the theoretical results.
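
The fractional Bayes factor admits a closed form in a toy version of this setting (known variance, flat priors on segment means, no ARMA structure); the sketch below computes the FBF of "one mean change at a known point" against "no change", with the training fraction b chosen by assumption:

```python
import numpy as np

def log_m_b(y_segments, sigma2, b):
    """log of the fractional marginal m^b under flat priors on each segment
    mean and known variance (the tractable toy case)."""
    n = sum(len(s) for s in y_segments)
    out = -(b * n / 2) * np.log(2 * np.pi * sigma2)
    for s in y_segments:
        S = np.sum((s - s.mean()) ** 2)
        out += 0.5 * np.log(2 * np.pi * sigma2 / (b * len(s))) - b * S / (2 * sigma2)
    return out

def fractional_bayes_factor(y, k, sigma2, b):
    """FBF of 'one mean change at k' (M1) vs 'no change' (M0):
    B10^F = [m1(x)/m1^b(x)] / [m0(x)/m0^b(x)] (O'Hagan, 1995)."""
    lq1 = log_m_b([y[:k], y[k:]], sigma2, 1.0) - log_m_b([y[:k], y[k:]], sigma2, b)
    lq0 = log_m_b([y], sigma2, 1.0) - log_m_b([y], sigma2, b)
    return np.exp(lq1 - lq0)

rng = np.random.default_rng(0)
y = np.concatenate([rng.normal(0, 1, 50), rng.normal(2, 1, 50)])
# training fraction b = m/n with a minimal training size m (assumption)
print(fractional_bayes_factor(y, k=50, sigma2=1.0, b=2 / len(y)))  # >> 1
```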

Bayesian Hierarchical Model using Gibbs Sampler Method: Field Mice Example (깁스 표본 기법을 이용한 베이지안 계층적 모형: 야생쥐의 예)

  • Song, Jae-Kee;Lee, Gun-Hee;Ha, Il-Do
    • Journal of the Korean Data and Information Science Society
    • /
    • v.7 no.2
    • /
    • pp.247-256
    • /
    • 1996
  • In this paper, we apply a Bayesian hierarchical model to analyze the field mice example introduced by Dempster et al. (1981). For this example, we use the Gibbs sampler to obtain the posterior mean and compare it with the LSE (least squares estimator) and the MLR (maximum likelihood estimator with random effects) computed via the EM algorithm.
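
A minimal Gibbs sampler for a one-way normal hierarchical (random effects) model of the kind the abstract applies; the grouped toy data and the vague inverse-gamma priors below are assumptions standing in for the field-mice data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy grouped data: m groups of n_per observations each (assumption)
m, n_per = 8, 6
u_true = rng.normal(0, 1.0, m)
y = u_true[:, None] + rng.normal(0, 0.5, (m, n_per))

def gibbs_hierarchical(y, n_iter=3000):
    """Gibbs sampler for y_ij = mu + u_i + e_ij with u_i ~ N(0, tau2) and
    e_ij ~ N(0, sig2); flat prior on mu, vague IG(0.01, 0.01) on variances."""
    m, n = y.shape
    mu, tau2, sig2 = y.mean(), 1.0, 1.0
    keep = []
    for it in range(n_iter):
        # u_i | rest : normal, precision-weighted shrinkage toward 0
        prec = n / sig2 + 1.0 / tau2
        u = rng.normal((y - mu).sum(axis=1) / sig2 / prec, np.sqrt(1.0 / prec))
        # mu | rest : normal around the grand residual mean
        mu = rng.normal((y - u[:, None]).mean(), np.sqrt(sig2 / (m * n)))
        # variances | rest : inverse-gamma full conditionals
        sig2 = 1.0 / rng.gamma(0.01 + m * n / 2,
                               1.0 / (0.01 + 0.5 * np.sum((y - mu - u[:, None]) ** 2)))
        tau2 = 1.0 / rng.gamma(0.01 + m / 2, 1.0 / (0.01 + 0.5 * np.sum(u ** 2)))
        keep.append(mu)
    return np.array(keep)

print("posterior mean of mu:", gibbs_hierarchical(y)[500:].mean())
```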

Generative probabilistic model with Dirichlet prior distribution for similarity analysis of research topic

  • Milyahilu, John;Kim, Jong Nam
    • Journal of Korea Multimedia Society
    • /
    • v.23 no.4
    • /
    • pp.595-602
    • /
    • 2020
  • We propose a generative probabilistic model with a Dirichlet prior distribution for topic modeling and text similarity analysis. It assigns a topic to each document and calculates text correlations between documents within a corpus. It also provides the posterior probabilities assigned to each topic of a document, based on the prior distribution over the corpus. We then present a Gibbs sampling algorithm for inference about the posterior distribution and compute text correlations among 50 abstracts from papers published by IEEE. We also conduct supervised learning to set a benchmark against which the performance of LDA (latent Dirichlet allocation) is judged. The experiments show that the accuracy of topic assignment is 76% for LDA. The supervised learning results show an accuracy of 61%, a precision of 93%, and an F1-score of 96%. A discussion of the experimental results gives a thorough justification based on probabilities, distributions, evaluation metrics, and correlation coefficients with respect to topic assignment.
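
A compact collapsed Gibbs sampler for LDA of the kind the abstract presents, run on a toy corpus of integer word ids (an assumption); the per-document topic proportions theta it returns can be correlated across documents for the similarity analysis described:

```python
import numpy as np

rng = np.random.default_rng(0)

def lda_gibbs(docs, V, K, alpha=0.1, beta=0.01, n_iter=200):
    """Collapsed Gibbs sampling for LDA: resample each token's topic from
    p(z=k) ~ (n_dk + alpha) * (n_kw + beta) / (n_k + V*beta)."""
    z = [rng.integers(K, size=len(d)) for d in docs]
    ndk = np.zeros((len(docs), K)); nkw = np.zeros((K, V)); nk = np.zeros(K)
    for d, doc in enumerate(docs):                 # initialize count tables
        for i, w in enumerate(doc):
            ndk[d, z[d][i]] += 1; nkw[z[d][i], w] += 1; nk[z[d][i]] += 1
    for it in range(n_iter):
        for d, doc in enumerate(docs):
            for i, w in enumerate(doc):
                k = z[d][i]                        # remove token's current topic
                ndk[d, k] -= 1; nkw[k, w] -= 1; nk[k] -= 1
                p = (ndk[d] + alpha) * (nkw[:, w] + beta) / (nk + V * beta)
                k = rng.choice(K, p=p / p.sum())   # resample and re-add
                z[d][i] = k
                ndk[d, k] += 1; nkw[k, w] += 1; nk[k] += 1
    return (ndk + alpha) / (ndk + alpha).sum(axis=1, keepdims=True)

docs = [[0, 1, 1, 2], [0, 0, 2, 1], [3, 4, 4, 5], [5, 3, 4, 4]]  # toy word ids
theta = lda_gibbs(docs, V=6, K=2)
print(np.corrcoef(theta))  # document-document correlation on topic mixtures
```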