• Title/Summary/Keyword: variational Bayes

Search Results: 6

Introduction to variational Bayes for high-dimensional linear and logistic regression models (고차원 선형 및 로지스틱 회귀모형에 대한 변분 베이즈 방법 소개)

  • Jang, Insong; Lee, Kyoungjae
    • The Korean Journal of Applied Statistics / v.35 no.3 / pp.445-455 / 2022
  • In this paper, we introduce existing Bayesian methods for high-dimensional sparse regression models and compare their performance in various simulation scenarios. In particular, we focus on the variational Bayes approach proposed by Ray and Szabó (2021), which enables scalable and accurate Bayesian inference. Based on data sets simulated from sparse high-dimensional linear regression models, we compare the variational Bayes approach with other Bayesian and frequentist methods. To check the practical performance of variational Bayes in logistic regression models, a real data analysis is conducted using a leukemia data set.
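The mean-field approximation underlying these methods can be illustrated on a much simpler model than sparse regression. Below is a minimal coordinate-ascent sketch for a normal model with unknown mean and precision — a standard textbook example, not the spike-and-slab method of Ray and Szabó (2021); the priors and data here are made up for illustration:

```python
import numpy as np

# Toy data; the variational posterior should recover its mean and precision.
rng = np.random.default_rng(0)
x = rng.normal(2.0, 1.5, size=500)
N, xbar = x.size, x.mean()

# Weak priors (assumed values): mu ~ N(mu0, 1/lam0), tau ~ Gamma(a0, b0)
mu0, lam0, a0, b0 = 0.0, 1e-2, 1e-2, 1e-2

E_tau = 1.0  # initial guess for E[tau]
for _ in range(50):
    # Update q(mu) = N(mu_N, 1/lam_N), weighting the data by E[tau]
    lam_N = lam0 + N * E_tau
    mu_N = (lam0 * mu0 + E_tau * N * xbar) / lam_N
    # Update q(tau) = Gamma(a_N, b_N), using
    # E_mu[(x_i - mu)^2] = (x_i - mu_N)^2 + 1/lam_N
    a_N = a0 + N / 2
    b_N = b0 + 0.5 * (np.sum((x - mu_N) ** 2) + N / lam_N)
    E_tau = a_N / b_N
```

After a few sweeps, `mu_N` is close to the sample mean and `E_tau` to the inverse sample variance, which is the expected behavior of the mean-field fixed point under weak priors.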

A variational Bayes method for pharmacokinetic model (약물동태학 모형에 대한 변분 베이즈 방법)

  • Park, Sun; Jo, Seongil; Lee, Woojoo
    • The Korean Journal of Applied Statistics / v.34 no.1 / pp.9-23 / 2021
  • In this paper, we introduce a variational Bayes method that approximates posterior distributions using the mean-field approach. In particular, we introduce automatic differentiation variational inference (ADVI), which approximates joint posterior distributions by a product of Gaussian distributions after transforming parameters into real coordinate space, and apply it to pharmacokinetic models, which describe the time course of drug absorption, distribution, metabolism, and excretion. We analyze real data sets using ADVI and compare the results with those based on Markov chain Monte Carlo. We implement the algorithms in Stan.
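The core of ADVI, once parameters have been mapped to real coordinate space, is stochastic gradient ascent on the ELBO via the reparameterization trick. A toy NumPy sketch against a known Gaussian target (chosen so convergence can be checked; Stan's ADVI handles arbitrary differentiable models and applies the constraining transforms automatically):

```python
import numpy as np

# Target "posterior" on an unconstrained parameter: N(1.5, 0.7^2).
# We fit q(theta) = N(mu, sigma^2) by maximizing the ELBO.
m_true, s_true = 1.5, 0.7

def grad_log_p(theta):
    # Gradient of the log target density
    return -(theta - m_true) / s_true**2

rng = np.random.default_rng(1)
mu, log_sigma = 0.0, 0.0          # variational parameters
lr, batch = 0.05, 500
for _ in range(3000):
    eps = rng.standard_normal(batch)
    sigma = np.exp(log_sigma)
    theta = mu + sigma * eps       # reparameterization trick
    g = grad_log_p(theta)
    grad_mu = g.mean()
    # Chain rule through theta, plus +1 from the entropy of q
    grad_log_sigma = (g * sigma * eps).mean() + 1.0
    mu += lr * grad_mu
    log_sigma += lr * grad_log_sigma
```

Since the target is itself Gaussian, the fitted `mu` and `exp(log_sigma)` converge to the target's mean and standard deviation; for a non-Gaussian posterior the same procedure returns the best Gaussian approximation in the KL sense.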

Implementation of Variational Bayes for Gaussian Mixture Models and Derivation of Factorial Variational Approximation (변분 근사화 분포의 유도 및 변분 베이지안 가우시안 혼합 모델의 구현)

  • Lee, Gi-Sung
    • Journal of the Korea Academia-Industrial cooperation Society / v.9 no.5 / pp.1249-1254 / 2008
  • A crucial part of working with graphical models is computing the posterior distribution of the parameters and the hidden variables given the observed data. In this paper, an implementation of the variational Bayes method for the Gaussian mixture model and a derivation of the factorial variational approximation are presented. The results can be used for data analysis tasks such as information retrieval or data visualization.
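A stripped-down version of the variational Bayes updates for a Gaussian mixture can be sketched as follows. This toy keeps the mixing weights and the component variance fixed and only updates q(z) and q(mu_k); the full factorial approximation of the paper would also place Dirichlet and Gamma/Wishart posteriors on the weights and precisions:

```python
import numpy as np

# Two well-separated 1-D clusters (synthetic data for illustration)
rng = np.random.default_rng(2)
x = np.concatenate([rng.normal(-2.0, 0.5, 100), rng.normal(3.0, 0.5, 100)])
sigma2, pi = 0.5**2, np.array([0.5, 0.5])   # known variance and weights
mu0, lam0 = 0.0, 1e-2                        # prior on each component mean

m = np.array([-1.0, 1.0])                    # q(mu_k) means (initial)
v = np.ones(2)                               # q(mu_k) variances
for _ in range(50):
    # q(z): responsibilities, using E[(x_i - mu_k)^2] = (x_i - m_k)^2 + v_k
    log_r = np.log(pi) - 0.5 * ((x[:, None] - m) ** 2 + v) / sigma2
    r = np.exp(log_r - log_r.max(axis=1, keepdims=True))
    r /= r.sum(axis=1, keepdims=True)
    # q(mu_k): Gaussian, with the data weighted by the responsibilities
    Nk = r.sum(axis=0)
    lam = lam0 + Nk / sigma2
    m = (lam0 * mu0 + (r * x[:, None]).sum(axis=0) / sigma2) / lam
    v = 1.0 / lam
```

The posterior means `m` settle near the true cluster centers, and `v` shrinks as each component accumulates effective sample size `Nk`.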

HEVA: Cooperative Localization using a Combined Non-Parametric Belief Propagation and Variational Message Passing Approach

  • Oikonomou-Filandras, Panagiotis-Agis; Wong, Kai-Kit
    • Journal of Communications and Networks / v.18 no.3 / pp.397-410 / 2016
  • This paper proposes a novel cooperative localization method for distributed wireless networks in 3-dimensional (3D), global positioning system (GPS)-denied environments. The proposed method, referred to as the hybrid ellipsoidal variational algorithm (HEVA), combines non-parametric belief propagation (NBP) and variational Bayes (VB) to benefit from both the rich information in NBP and the compact communication size of a parametric form. In HEVA, two novel filters are also employed. The first mitigates non-line-of-sight (NLoS) time-of-arrival (ToA) messages, permitting the algorithm to work well in high-noise environments with NLoS bias, while the second reduces the number of calculations. Simulation results illustrate that HEVA significantly outperforms traditional NBP methods in localization while requiring only 50% of their complexity. The superiority of VB over other clustering techniques is also shown.

New Inference for a Multiclass Gaussian Process Classification Model using a Variational Bayesian EM Algorithm and Laplace Approximation

  • Cho, Wanhyun; Kim, Sangkyoon; Park, Soonyoung
    • IEIE Transactions on Smart Processing and Computing / v.4 no.4 / pp.202-208 / 2015
  • In this study, we propose a new inference algorithm for a multiclass Gaussian process classification model using a variational EM framework and the Laplace approximation (LA) technique. The inference is performed in two steps, called expectation and maximization. First, in the expectation step (E-step), using Bayes' theorem and the LA technique, we derive the approximate posterior distribution of the latent function, which indicates the probability that each observation belongs to a certain class in the Gaussian process classification model. In the maximization step (M-step), we compute the maximum likelihood estimators of the hyper-parameters of the covariance matrix that defines the prior distribution of the latent function, using the posterior distribution derived in the E-step. These steps are repeated until a convergence condition is satisfied. We conducted experiments using synthetic data and the Iris data set to verify the performance of the proposed algorithm. Experimental results show that the proposed algorithm performs well on these data sets.
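The Laplace approximation used in the E-step fits a Gaussian at the posterior mode using the local curvature there. A one-parameter toy (Bernoulli likelihood with a standard normal prior — not the paper's full GP model) shows the idea:

```python
import numpy as np

# Toy data: k successes in n Bernoulli trials with p = sigmoid(theta),
# prior theta ~ N(0, 1).  LA approximates the posterior by a Gaussian
# centered at the mode, with variance from the negative inverse Hessian.
k, n = 7, 10

def sigmoid(t):
    return 1.0 / (1.0 + np.exp(-t))

theta = 0.0
for _ in range(20):                # Newton's method for the posterior mode
    grad = -theta + k - n * sigmoid(theta)            # d log p / d theta
    hess = -1.0 - n * sigmoid(theta) * (1 - sigmoid(theta))
    theta -= grad / hess

post_var = -1.0 / hess             # LA: q(theta) = N(theta, post_var)
```

Newton converges in a handful of iterations here; in the GP setting the same mode-plus-curvature construction is applied to the whole vector of latent function values.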

Variational Bayesian multinomial probit model with Gaussian process classification on mice protein expression level data (가우시안 과정 분류에 대한 변분 베이지안 다항 프로빗 모형: 쥐 단백질 발현 데이터에의 적용)

  • Son, Donghyun; Hwang, Beom Seuk
    • The Korean Journal of Applied Statistics / v.36 no.2 / pp.115-127 / 2023
  • The multinomial probit model is a popular model for multiclass classification and choice modeling. The Markov chain Monte Carlo (MCMC) method is widely used for estimating the multinomial probit model, but its computational cost is high. Variational Bayesian approximation is well known to be more computationally efficient than MCMC, because it replaces sampling with optimization. In this study, we describe the multinomial probit model with Gaussian process classification and how to apply variational Bayesian approximation to the model. This study also compares the results of the variational Bayesian multinomial probit model with those of naive Bayes, K-nearest neighbors, and support vector machines on the UCI mice protein expression level data.