A Bayesian Method for Narrowing the Scope of Variable Selection in Binary Response t-Link Regression

  • Published: 2000.12.01

Abstract

This article is concerned with selecting the predictor variables to be included when building a class of binary response t-link regression models, a class in which both the probit and logistic regression models can be taken, at least approximately, as members. Based on a modification of the stochastic search variable selection (SSVS) method, we propose and develop a Bayesian procedure that uses probabilistic considerations to select promising subsets of predictor variables. The procedure reformulates the binary response t-link regression setup as a hierarchical truncated normal mixture model by introducing a set of hyperparameters that identify the subset choices. In this setup, the most promising subset of predictors is identified as the one with the highest posterior probability under the marginal posterior distribution of the hyperparameters. An illustrative numerical example is given to highlight the merit of the procedure.
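
To make the flavour of the procedure concrete, the sketch below (Python, using numpy and scipy, both assumed here rather than mentioned in the article) shows a Gibbs sampler of the kind such a hierarchical formulation leads to, restricted for brevity to the probit link, i.e. the limiting normal case of the t-link. The function name ssvs_probit and the prior settings tau, c and p_incl are illustrative choices only, not quantities taken from the article.

    # Minimal sketch of an SSVS-style Gibbs sampler for a binary probit model
    # (the limiting normal case of the t-link), using Albert-Chib data augmentation.
    # Illustrative only; names and prior settings are not taken from the article.
    import numpy as np
    from scipy.stats import truncnorm

    rng = np.random.default_rng(0)

    def ssvs_probit(X, y, n_iter=2000, burn=500, tau=0.1, c=10.0, p_incl=0.5):
        """Spike N(0, tau^2) vs. slab N(0, (c*tau)^2) prior on each coefficient."""
        n, k = X.shape
        beta = np.zeros(k)
        gamma = np.ones(k, dtype=int)        # 1 = predictor in the slab ("included")
        draws = []
        for it in range(n_iter):
            # 1) Latent utilities z_i ~ N(x_i'beta, 1), truncated by the observed y_i.
            mu = X @ beta
            lo = np.where(y == 1, -mu, -np.inf)      # standardized truncation bounds
            hi = np.where(y == 1, np.inf, -mu)
            z = mu + truncnorm.rvs(lo, hi, size=n, random_state=rng)
            # 2) beta | z, gamma: conjugate normal update with diagonal prior variances d.
            d = np.where(gamma == 1, (c * tau) ** 2, tau ** 2)
            V = np.linalg.inv(X.T @ X + np.diag(1.0 / d))
            beta = rng.multivariate_normal(V @ (X.T @ z), V)
            # 3) gamma_j | beta_j: Bernoulli with odds given by slab vs. spike densities.
            slab = np.exp(-0.5 * (beta / (c * tau)) ** 2) / (c * tau)
            spike = np.exp(-0.5 * (beta / tau) ** 2) / tau
            gamma = rng.binomial(1, p_incl * slab / (p_incl * slab + (1 - p_incl) * spike))
            if it >= burn:
                draws.append(tuple(gamma))
        # The most promising subset is the indicator pattern visited most often.
        patterns, counts = np.unique(draws, axis=0, return_counts=True)
        return patterns[np.argmax(counts)], counts.max() / len(draws)

    # Hypothetical usage: three relevant predictors plus two noise columns.
    X = rng.standard_normal((200, 5))
    y = (X[:, :3] @ np.array([1.5, -1.0, 0.8]) + rng.standard_normal(200) > 0).astype(int)
    best_subset, est_prob = ssvs_probit(X, y)
    print(best_subset, est_prob)

Each draw of the indicator vector records one visited subset, and the subset reported at the end is the one visited most often, i.e. the one with the highest estimated posterior probability, in the spirit of the procedure described above. Under a t-link the same scheme would extend by drawing, in addition, a latent mixing scale for each observation (the usual scale-mixture-of-normals representation of the t distribution), which is presumably the role of the truncated normal mixture formulation referred to in the abstract.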

Keywords

References

  1. Journal of the American Statistical Association v.88 Bayesian analysis of binary and polychotomous response data Albert, J. H.;Chib, S.
  2. Bayesian theory Bernardo, J. M.;Smith, A. F. M.
  3. CODA: Convergence diagnosis and output analysis software for Gibbs sampling output, version 0.30 Best, N.;Cowles, M. K.;Vines, K.
  4. The American Statistician v.46 Explaining the Gibbs sampler Casella, G.;George, E. I.
  5. Journal of the American Statistical Association v.90 Marginal likelihood from the Gibbs output Chib, S.
  6. Modelling binary data Collett, D.
  7. Journal of the American Statistical Association v.91 Markov chain Monte Carlo convergence diagnostics: a comparative review Cowles, M. K.;Carlin, B. P.
  8. Available on the MCMC preprint server On Bayesian model and variable selection using MCMC Dellaportas, P.;Forster, J. J.;Ntzoufras, I.
  9. Non-uniform random variate generation Devroye, L.
  10. Probit analysis, 3rd ed. Finney, D. J.
  11. IEEE Transactions on Pattern Analysis and Machine Intelligence v.6 Stochastic relaxation, Gibbs distributions and the Bayesian restoration of images Geman, S.;Geman, D.
  12. Journal of the Royal Statistical Society v.B56 Bayesian model choice: asymptotics and exact calculations Gelfand, A. E.;Dey, D. K.
  13. Statistical Science v.7 Inference from iterative simulation using multiple sequences (with discussion) Gelman, A.;Rubin, D. B.
  14. Journal of the American Statistical Association v.88 Variable selection via Gibbs sampling George, E. I.;McCulloch, R. E.
  15. In Markov Chain Monte Carlo in Practice, Eds. W. R. Gilks, S. Richardson and D. S. Spiegelhalter Stochastic search variable selection George, E. I.;McCulloch, R. E.
  16. In Bayesian Analysis in Statistics and Econometrics: Essays in Honor of Arnold Zellner, Eds. D. A. Berry, K. M. Chaloner and J. K. Geweke Two approaches to Bayesian model selection with applications George, E. I.;McCulloch, R. E.;Tsay, R.
  17. In Bayesian Statistics 5, Eds. J. M. Bernardo, J. O. Berger, A. P. Dawid and A. F. M. Smith Variable selection and model comparison in regression Geweke, J.
  18. Applied Statistics v.41 Adaptive rejection sampling for Gibbs sampling Gilks, W. R.;Wild, P.
  19. Technical Report 5-94-03 Reversible jump MCMC computation and Bayesian model determination Green, P. J.
  20. Journal of the American Statistical Association v.90 Information distinguishability with application to analysis of failure data Soofi, E. S.;Ebrahimi, N.;Habibullah, M.
  21. Journal of the American Statistical Association v.82 The calculation of posterior distributions by data augmentation Tanner, M. A.;Wong, W. H.
  22. Annals of Statistics v.22 Markov chains for exploring posterior distributions Tierney, L.