• Title/Summary/Keyword: Marginal likelihood

Search results: 78

Laplace-Metropolis Algorithm for Variable Selection in Multinomial Logit Model

  • 김혜중;이애경
    • Journal of Korean Society for Quality Management / v.29 no.1 / pp.11-23 / 2001
  • This paper is concerned with suggesting a Bayesian method for variable selection in the multinomial logit model. It is based upon an optimal rule, obtained via Bayes rule, which minimizes the risk incurred in selecting the multinomial logit model. The rule is to find the subset of variables that maximizes the marginal likelihood of the model. We also propose a Laplace-Metropolis algorithm intended to provide a simple method for estimating the marginal likelihood of the model (a generic sketch of this estimator follows this entry). Based upon two examples, an artificial data example and an empirical data example, the Bayesian method is illustrated and its efficiency is examined.

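To make the marginal likelihood step concrete, the sketch below implements the generic Laplace-Metropolis estimator from posterior draws: the posterior mode is approximated by the best sampled draw and the posterior covariance by the sample covariance of the draws. This is a minimal illustrative sketch with made-up names, not the authors' implementation.

```python
import numpy as np

def laplace_metropolis_log_marginal(draws, log_post_unnorm):
    """Laplace-Metropolis estimate of the log marginal likelihood.

    draws           : (n_draws, d) array of posterior samples from a Metropolis run
    log_post_unnorm : callable returning log prior + log likelihood at a draw
    """
    d = draws.shape[1]
    log_post = np.array([log_post_unnorm(theta) for theta in draws])
    # Approximate the posterior mode by the highest-scoring sampled draw.
    log_post_at_mode = log_post.max()
    # Approximate the posterior covariance by the sample covariance of the draws.
    sigma_hat = np.cov(draws, rowvar=False).reshape(d, d)
    _, log_det = np.linalg.slogdet(sigma_hat)
    # Laplace approximation to the marginal likelihood, evaluated at the estimated mode.
    return 0.5 * d * np.log(2.0 * np.pi) + 0.5 * log_det + log_post_at_mode
```

In the variable-selection rule described in the abstract, such an estimate would be computed for each candidate subset of covariates and the subset with the largest estimated marginal likelihood retained.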

NEW LM TESTS FOR UNIT ROOTS IN SEASONAL AR PROCESSES

  • Oh, Yu-Jin;So, Beong-Soo
    • Journal of the Korean Statistical Society / v.36 no.4 / pp.447-456 / 2007
  • On the basis of the marginal likelihood of the residual vector, which is free of nuisance mean parameters, we propose new Lagrange Multiplier (LM) seasonal unit root tests for seasonal autoregressive processes. The limiting null distribution of the tests is the standardized $\chi^2$ distribution. A Monte Carlo simulation shows that the new tests are more powerful than tests based on the ordinary least squares (OLS) estimator, especially for a large number of seasons and short time spans (the null model is sketched schematically after this entry).
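
For orientation only, the seasonal unit root testing problem for a seasonal AR(1) process with season length $s$ can be written schematically as below; the notation is generic and not taken from the paper.

```latex
y_t = \rho\, y_{t-s} + e_t, \qquad e_t \sim \mathrm{iid}(0, \sigma^2), \qquad t = 1, \dots, T,
\qquad
H_0 : \rho = 1 \ \text{(seasonal unit roots)} \quad \text{vs.} \quad H_1 : |\rho| < 1 .
```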

Random Effects Models for Multivariate Survival Data: Hierarchical-Likelihood Approach

  • Ha Il Do;Lee Youngjo;Song Jae-Kee
    • Proceedings of the Korean Statistical Society Conference / 2000.11a / pp.193-200 / 2000
  • Modelling dependence via random effects in censored multivariate survival data has recently received considerable attention in the biomedical literature. These random effects models describe not only the conditional survival times but also the conditional hazard rate. Systematic likelihood inference for models with random effects is possible using Lee and Nelder's (1996) hierarchical likelihood (h-likelihood). The purpose of this presentation is to introduce Ha et al.'s (2000a,b) inferential methods for random effects models via the h-likelihood, which provide conceptually simple, numerically efficient and reliable inferential procedures (the generic form of the h-likelihood is sketched after this entry).

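For reference, the hierarchical likelihood (h-likelihood) of Lee and Nelder (1996) cited above has, in schematic notation, the form

```latex
h(\beta, \theta; y, v) \;=\; \sum_{i,j} \log f\bigl(y_{ij} \mid v_i; \beta\bigr) \;+\; \sum_i \log f\bigl(v_i; \theta\bigr),
```

where $y_{ij}$ are the (possibly censored) survival outcomes, $v_i$ the unobserved random effects, and $\beta$, $\theta$ the fixed-effect and frailty parameters. Inference proceeds by maximizing $h$ and its adjusted profile versions rather than the marginal likelihood obtained by integrating out the $v_i$. This is generic notation, not reproduced from the paper.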

H-likelihood approach for variable selection in gamma frailty models

  • Ha, Il-Do;Cho, Geon-Ho
    • Journal of the Korean Data and Information Science Society / v.23 no.1 / pp.199-207 / 2012
  • Recently, variable selection methods using a penalized likelihood with a shrinkage penalty function have been widely studied in various statistical models, including generalized linear models and survival models. In particular, they select important variables and estimate the coefficients of covariates simultaneously. In this paper, we develop a penalized h-likelihood method for variable selection in gamma frailty models. For this we use the smoothly clipped absolute deviation (SCAD) penalty function, which has good properties for variable selection (the standard form of the penalty is sketched after this entry). The proposed method is illustrated using a simulation study and a practical data set.
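
The SCAD penalty mentioned in the abstract has a standard closed form (Fan and Li's smoothly clipped absolute deviation, usually with shape parameter a = 3.7). The sketch below gives the generic penalty only, not the authors' full penalized h-likelihood procedure; the names are illustrative.

```python
import numpy as np

def scad_penalty(beta, lam, a=3.7):
    """Standard SCAD penalty evaluated elementwise at coefficients beta.

    lam : tuning parameter lambda > 0
    a   : shape parameter, a > 2 (3.7 is the usual default)
    """
    b = np.abs(np.asarray(beta, dtype=float))
    linear = lam * b                                                 # |b| <= lam
    quadratic = -(b**2 - 2 * a * lam * b + lam**2) / (2 * (a - 1))   # lam < |b| <= a*lam
    constant = (a + 1) * lam**2 / 2                                  # |b| > a*lam
    return np.where(b <= lam, linear,
                    np.where(b <= a * lam, quadratic, constant))
```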

On an Optimal Bayesian Variable Selection Method for Generalized Logit Model

  • Kim, Hea-Jung;Lee, Ae Kuoung
    • Communications for Statistical Applications and Methods / v.7 no.2 / pp.617-631 / 2000
  • This paper is concerned with suggesting a Bayesian method for variable selection in the generalized logit model. It is based on a Laplace-Metropolis algorithm intended to provide a simple method for estimating the marginal likelihood of the model. The algorithm then leads to a criterion for the selection of variables. The criterion is to find the subset of variables that maximizes the marginal likelihood of the model, and it is seen to be a Bayes rule in the sense that it minimizes the risk of the variable selection under a 0-1 loss function (a schematic statement of this equivalence follows this entry). Based upon two examples, the suggested method is illustrated and compared with existing frequentist methods.

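The equivalence between maximizing the marginal likelihood and the 0-1 loss Bayes rule mentioned in the abstract can be written schematically as follows, assuming equal prior probabilities over the candidate models (the paper's exact prior specification is not reproduced here).

```latex
p(M_k \mid y) \;\propto\; p(y \mid M_k)\, p(M_k),
\qquad
\hat{k} \;=\; \arg\max_k \, p(M_k \mid y) \;=\; \arg\max_k \, p(y \mid M_k)
\quad \text{when } p(M_k) \text{ is constant in } k,
```

where $M_k$ denotes the model built from the $k$-th candidate subset of covariates and $p(y \mid M_k)$ its marginal likelihood.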

Cumulative Sums of Residuals in GLMM and Its Implementation

  • Choi, DoYeon;Jeong, KwangMo
    • Communications for Statistical Applications and Methods / v.21 no.5 / pp.423-433 / 2014
  • Test statistics using cumulative sums of residuals have been widely used in various regression models, including generalized linear models (GLM). Recently, Pan and Lin (2005) extended this testing procedure to generalized linear mixed models (GLMM) with random effects, in which we encounter difficulties in computing the marginal likelihood, which is expressed as an integral over the random effects distribution. The Gaussian quadrature algorithm is commonly used to approximate the marginal likelihood (a one-cluster quadrature sketch follows this entry). Many commercial statistical packages provide an option to apply this type of goodness-of-fit test in GLMs, but available programs are very rare for GLMMs. We suggest a computational algorithm to implement the testing procedure in GLMMs via a freely accessible R package, and also illustrate it through practical examples.
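
As an illustration of the quadrature step mentioned in the abstract, the sketch below approximates the marginal likelihood contribution of one cluster in a random-intercept model with a Gaussian random effect via Gauss-Hermite quadrature. The conditional likelihood callable and all names are generic placeholders, not the R package interface discussed in the paper.

```python
import numpy as np

def cluster_marginal_likelihood(cond_lik, sigma, n_nodes=20):
    """Approximate L_i = integral of f(y_i | b) * N(b; 0, sigma^2) db by Gauss-Hermite quadrature.

    cond_lik : callable b -> f(y_i | b), the conditional likelihood of cluster i
    sigma    : standard deviation of the Gaussian random intercept
    """
    nodes, weights = np.polynomial.hermite.hermgauss(n_nodes)
    # Change of variables b = sqrt(2) * sigma * x turns the Gaussian integral
    # into the Gauss-Hermite form  integral of exp(-x^2) g(x) dx.
    values = np.array([cond_lik(np.sqrt(2.0) * sigma * x) for x in nodes])
    return float(np.sum(weights * values) / np.sqrt(np.pi))
```

The full marginal likelihood is then the product (or sum of logs) of these cluster-level contributions.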

EMPIRICAL BAYES THRESHOLDING: ADAPTING TO SPARSITY WHEN IT IS ADVANTAGEOUS TO DO SO

  • Silverman Bernard W.
    • Journal of the Korean Statistical Society / v.36 no.1 / pp.1-29 / 2007
  • Suppose one is trying to estimate a high-dimensional vector of parameters from data consisting of one observation per parameter. Often, it is possible to take advantage of sparsity in the parameters by thresholding the data in an appropriate way. A marginal maximum likelihood approach, within a suitable Bayesian structure, has excellent properties (the basic mixture-prior setup is sketched after this entry). For very sparse signals, the procedure chooses a large threshold and takes advantage of the sparsity, while for signals with many non-zero values, the method does not perform excessive smoothing. The scope of the method is reviewed and demonstrated, and various theoretical, practical and computational issues are discussed, in particular exploring the wide potential and applicability of the general approach, and the way it can be used within more complex thresholding problems such as curve estimation using wavelets.
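
Schematically, the marginal maximum likelihood step described above places a sparse mixture prior on each parameter and estimates the mixing weight from the data (generic notation, not reproduced from the paper):

```latex
\theta_i \sim (1 - w)\,\delta_0 + w\,\gamma, \qquad X_i \mid \theta_i \sim N(\theta_i, 1),
\qquad
\hat{w} \;=\; \arg\max_{w} \; \sum_i \log\bigl\{ (1 - w)\,\phi(x_i) + w\,(\gamma \ast \phi)(x_i) \bigr\},
```

where $\delta_0$ is a point mass at zero, $\gamma$ a fixed heavy-tailed density, $\phi$ the standard normal density and $\gamma \ast \phi$ their convolution; the estimated weight $\hat{w}$ then determines the threshold applied to each observation.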

On a Bayes Criterion for the Goodness-of-Link Test for Binary Response Regression Models : Probit Link versus Logit Link

  • Kim, Hea-Jung
    • Journal of the Korean Statistical Society / v.26 no.2 / pp.261-276 / 1997
  • In the context of binary response regression, the problem of constructing a Bayesian goodness-of-link test for testing the logit link versus the probit link is considered. Based upon the well-known facts that the cdf of a logistic variate is approximately the cdf of $t_{8}/0.634$ and that, as $\nu \to \infty$, the cdf of $t_{\nu}$ approaches that of $N(0,1)$, a Bayes factor is derived as a test criterion (its generic form is given after this entry). A synthesis of Gibbs sampling and a marginal likelihood estimation scheme is also proposed to compute the Bayes factor. Performance of the test is investigated via a Monte Carlo study. The new test is also illustrated with an empirical data example.

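For reference, the Bayes factor used as the goodness-of-link criterion compares the two link models through their marginal likelihoods; in generic notation,

```latex
B_{\text{logit,probit}} \;=\; \frac{p(y \mid M_{\text{logit}})}{p(y \mid M_{\text{probit}})},
\qquad
p(y \mid M) \;=\; \int p(y \mid \beta, M)\, \pi(\beta \mid M)\, d\beta,
```

which is the quantity the Gibbs-sampling-based marginal likelihood estimation scheme mentioned in the abstract is designed to approximate.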

Bayes factors for accelerated life testing models

  • Smit, Neill;Raubenheimer, Lizanne
    • Communications for Statistical Applications and Methods / v.29 no.5 / pp.513-532 / 2022
  • In this paper, the use of Bayes factors and the deviance information criterion for model selection is compared in a Bayesian accelerated life testing setup. In Bayesian accelerated life testing, the most commonly used tool for model comparison is the deviance information criterion. An alternative and more formal approach is to use Bayes factors to compare models. However, Bayesian accelerated life testing models with more than one stressor often have mathematically intractable posterior distributions, and Markov chain Monte Carlo methods are employed to obtain posterior samples on which to base inference. The computation of the marginal likelihood is challenging when working with such complex models. In this paper, methods for approximating the marginal likelihood, and their application in the accelerated life testing paradigm, are explored for dual-stress models (one classical approximation is sketched after this entry). A simulation study is also included, in which Bayes factors computed using the different approximation methods and the deviance information criterion are compared.
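
As one concrete (and notoriously unstable) example of approximating the marginal likelihood from posterior samples, the harmonic mean estimator is sketched below. It is shown only to illustrate the kind of approximation being compared and is not claimed to be among the specific methods examined in the paper.

```python
import numpy as np
from scipy.special import logsumexp

def log_marginal_harmonic_mean(log_lik_draws):
    """Harmonic mean estimate of log p(y) from per-draw log-likelihood values.

    log_lik_draws : array of log p(y | theta_s) evaluated at posterior draws theta_s.
    Known to have very high (possibly infinite) variance; use with caution.
    """
    log_lik_draws = np.asarray(log_lik_draws, dtype=float)
    n = log_lik_draws.size
    # 1 / p_hat(y) = (1/n) * sum_s 1 / p(y | theta_s)
    return np.log(n) - logsumexp(-log_lik_draws)
```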

Comparison of Two Dependent Agreements Using Tests of Marginal Homogeneity

  • Oh, Myong-Sik
    • Communications for Statistical Applications and Methods / v.15 no.4 / pp.605-614 / 2008
  • Oh (2008) proposed a one-sided likelihood ratio test of the equality of two agreement measures. However, the use of this test may be limited since the computation of the test statistic and the critical value is not easy. We propose a test for comparing two dependent agreements using well-known tests for marginal homogeneity, such as the Bhapkar test and the Stuart-Maxwell test (a sketch of the Stuart-Maxwell statistic follows this entry). Data from the ladies' singles event at the 2008 World Figure Skating Championships are analyzed for illustration.
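
As an illustration of the marginal homogeneity tests mentioned above, here is a minimal sketch of the standard Stuart-Maxwell statistic for a square agreement table. It is a generic textbook implementation, not the authors' code, and does not cover the Bhapkar variant.

```python
import numpy as np
from scipy.stats import chi2

def stuart_maxwell_test(table):
    """Stuart-Maxwell test of marginal homogeneity for a square k x k table.

    table : k x k array of counts n[i, j] (rater 1 category i, rater 2 category j).
    Returns the chi-square statistic and its p-value on k - 1 degrees of freedom.
    """
    n = np.asarray(table, dtype=float)
    k = n.shape[0]
    row, col = n.sum(axis=1), n.sum(axis=0)
    d = (row - col)[: k - 1]                     # differences of the first k-1 margins
    # Covariance matrix of d under marginal homogeneity:
    #   S[i, i] = row_i + col_i - 2 * n_ii,  S[i, j] = -(n_ij + n_ji) for i != j.
    s = -(n + n.T)[: k - 1, : k - 1]
    np.fill_diagonal(s, (row + col - 2.0 * np.diag(n))[: k - 1])
    stat = float(d @ np.linalg.solve(s, d))
    return stat, float(chi2.sf(stat, df=k - 1))
```

For a 2 x 2 table this reduces to McNemar's statistic, which is a useful sanity check on the implementation.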