• Title/Summary/Keyword: Variational Bayes estimation (변분-베이즈 추정)


Variational Bayesian multinomial probit model with Gaussian process classification on mice protein expression level data (가우시안 과정 분류에 대한 변분 베이지안 다항 프로빗 모형: 쥐 단백질 발현 데이터에의 적용)

  • Donghyun Son;Beom Seuk Hwang
    • The Korean Journal of Applied Statistics / v.36 no.2 / pp.115-127 / 2023
  • The multinomial probit model is a popular model for multiclass classification and choice modeling. Markov chain Monte Carlo (MCMC) methods are widely used for estimating the multinomial probit model, but their computational cost is high. Variational Bayesian approximation is known to be more computationally efficient than MCMC because it replaces posterior sampling with an optimization problem over a family of approximate distributions. In this study, we describe the multinomial probit model with Gaussian process classification and how to apply variational Bayesian approximation to the model. We also compare the results of the variational Bayesian multinomial probit model with those of naive Bayes, K-nearest neighbors, and support vector machine classifiers on the UCI mice protein expression level data.
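The comparison described in the abstract can be sketched as follows. This is an illustrative sketch only: the paper's variational Bayesian multinomial probit model is not available as a library, so scikit-learn's `GaussianProcessClassifier` (which uses a Laplace approximation, not the paper's variational scheme) stands in for the GP-based classifier, and synthetic multiclass data replaces the UCI mice protein expression dataset. The baselines match those named in the abstract.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.gaussian_process import GaussianProcessClassifier

# Synthetic stand-in for the mice protein expression data (3 classes).
X, y = make_classification(n_samples=300, n_features=10, n_informative=6,
                           n_classes=3, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Baselines from the abstract plus a GP classifier as a stand-in for the
# variational Bayesian multinomial probit model.
models = {
    "naive Bayes": GaussianNB(),
    "KNN": KNeighborsClassifier(5),
    "SVM": SVC(),
    "GP classifier": GaussianProcessClassifier(random_state=0),
}
scores = {name: m.fit(X_tr, y_tr).score(X_te, y_te) for name, m in models.items()}
for name, s in scores.items():
    print(f"{name}: test accuracy {s:.3f}")
```

On real data one would substitute the mice protein expression matrix for `X` and the class labels for `y`; the relative rankings reported in the paper depend on that dataset and are not reproduced here.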

Speech Enhancement Using Nonnegative Matrix Factorization with Temporal Continuity (시간 연속성을 갖는 비음수 행렬 분해를 이용한 음질 개선)

  • Nam, Seung-Hyon
    • The Journal of the Acoustical Society of Korea / v.34 no.3 / pp.240-246 / 2015
  • This paper addresses speech enhancement using nonnegative matrix factorization (NMF) with temporal continuity. Speech and noise signals are modeled with Poisson distributions, and the basis vectors and gain vectors of the NMF are modeled with Gamma distributions. Temporal continuity of the gain vectors is known to be critical to the quality of the enhanced speech. Here, temporal continuity is implemented by adopting Gamma-Markov chain priors for the noise gain vectors during the separation phase. Simulation results show that the Gamma-Markov chain models the temporal continuity of noise signals and tracks changes in the noise effectively.
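The temporal-continuity idea can be illustrated with a minimal sketch. Note the simplification: the paper imposes continuity through Gamma-Markov chain priors in a Bayesian NMF, whereas this sketch runs plain Lee-Seung multiplicative updates (Euclidean cost) and alternates them with a light weighted-average smoothing of the gain matrix along time, merely to illustrate that gains are encouraged to vary slowly across frames. The random matrix `V` is a stand-in for a magnitude spectrogram.

```python
import numpy as np

rng = np.random.default_rng(0)
V = np.abs(rng.standard_normal((64, 100)))   # stand-in magnitude spectrogram
K = 8                                        # number of basis vectors
W = rng.random((64, K)) + 1e-3               # basis (spectral) vectors
H = rng.random((K, 100)) + 1e-3              # gain (activation) vectors

for _ in range(200):
    # Standard Lee-Seung multiplicative updates for ||V - WH||^2;
    # small constants in the denominators avoid division by zero.
    H *= (W.T @ V) / (W.T @ W @ H + 1e-9)
    W *= (V @ H.T) / (W @ H @ H.T + 1e-9)
    # Temporal continuity (illustration only, not a Gamma-Markov prior):
    # pull each interior gain toward the mean of its temporal neighbors.
    H[:, 1:-1] = 0.9 * H[:, 1:-1] + 0.05 * (H[:, :-2] + H[:, 2:])

err = np.linalg.norm(V - W @ H) / np.linalg.norm(V)
print(f"relative reconstruction error: {err:.3f}")
```

In a proper Bayesian treatment, the smoothing step would be replaced by posterior inference under the Gamma-Markov chain prior, which couples each frame's gain to the previous frame through the Gamma scale parameter.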