BayesLogit

Bayesian inference for logistic models using Pólya-Gamma latent variables. We propose a new data-augmentation strategy for fully Bayesian inference in models with binomial likelihoods. The approach appeals to a new class of Pólya-Gamma distributions, which are constructed in detail. A variety of examples are presented to show the versatility of the method, including logistic regression, negative binomial regression, nonlinear mixed-effects models, and spatial models for count data. In each case, our data-augmentation strategy leads to simple, effective methods for posterior inference that (1) circumvent the need for analytic approximations, numerical integration, or Metropolis-Hastings; and (2) outperform other known data-augmentation strategies, both in ease of use and in computational efficiency. All methods, including an efficient sampler for the Pólya-Gamma distribution, are implemented in the R package BayesLogit. Supplementary materials for this article are available online.
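For concreteness, below is a minimal sketch of the Gibbs sampler the abstract describes, specialized to Bernoulli logistic regression. It relies only on the rpg() Pólya-Gamma sampler exported by BayesLogit; the function name pg_logit_gibbs, the N(0, 100 I) prior, and the simulated-data example are illustrative assumptions, not part of the package itself.

```r
## Sketch: Gibbs sampler for Bayesian logistic regression via
## Polya-Gamma data augmentation (Polson, Scott & Windle style).
## Assumes prior beta ~ N(0, B0); only rpg() comes from BayesLogit.
library(BayesLogit)

pg_logit_gibbs <- function(y, X, n_iter = 2000, B0 = diag(100, ncol(X))) {
  n <- nrow(X); p <- ncol(X)
  beta  <- rep(0, p)
  kappa <- y - 0.5              # kappa_i = y_i - 1/2 for Bernoulli outcomes
  P0    <- solve(B0)            # prior precision matrix
  draws <- matrix(NA_real_, n_iter, p)
  for (t in seq_len(n_iter)) {
    ## Step 1: draw omega_i | beta ~ PG(1, x_i' beta) with BayesLogit::rpg().
    omega <- rpg(n, h = 1, z = as.vector(X %*% beta))
    ## Step 2: beta | omega, y is exactly Gaussian with
    ##   V = (X' Omega X + P0)^(-1),  m = V X' kappa.
    V    <- chol2inv(chol(crossprod(X * omega, X) + P0))
    m    <- V %*% crossprod(X, kappa)
    beta <- as.vector(m + t(chol(V)) %*% rnorm(p))
    draws[t, ] <- beta
  }
  draws
}

## Example usage on simulated data:
set.seed(1)
X <- cbind(1, rnorm(200))
y <- rbinom(200, 1, plogis(X %*% c(-0.5, 1)))
fit <- pg_logit_gibbs(y, X)
colMeans(fit[-(1:500), ])   # posterior means after burn-in
```

Because omega | beta is Pólya-Gamma and beta | omega is exactly Gaussian, the sampler alternates two standard draws with no accept/reject step, which is what the abstract means by circumventing Metropolis-Hastings.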


References in zbMATH (referenced in 43 articles)

Showing results 1 to 20 of 43, sorted by year (citations).


  1. Mazzarisi, P.; Barucca, P.; Lillo, F.; Tantari, D.: A dynamic network model with persistent links and node-specific latent variables, with an application to the interbank market (2020)
  2. Wei, Ran; Ghosal, Subhashis: Contraction properties of shrinkage priors in logistic regression (2020)
  3. Bertolacci, Michael; Cripps, Edward; Rosen, Ori; Lau, John W.; Cripps, Sally: Climate inference on daily rainfall across the Australian continent, 1876--2015 (2019)
  4. Cadonna, Annalisa; Kottas, Athanasios; Prado, Raquel: Bayesian spectral modeling for multiple time series (2019)
  5. Chakraborty, Saptarshi; Khare, Kshitij: Consistent estimation of the spectrum of trace class data augmentation algorithms (2019)
  6. van de Wiel, Mark A.; te Beest, Dennis E.; Münch, Magnus M.: Learning from a lot: empirical Bayes for high-dimensional model-based prediction (2019)
  7. Durante, Daniele; Canale, Antonio; Rigon, Tommaso: A nested expectation-maximization algorithm for latent class models with covariates (2019)
  8. Durante, Daniele; Rigon, Tommaso: Conditionally conjugate mean-field variational Bayes for logistic models (2019)
  9. Durmus, Alain; Moulines, Éric: High-dimensional Bayesian inference via the unadjusted Langevin algorithm (2019)
  10. Frühwirth-Schnatter, Sylvia: Keeping the balance -- bridge sampling for marginal likelihood estimation in finite mixture, mixture of experts and Markov mixture models (2019)
  11. Frühwirth-Schnatter, Sylvia; Malsiner-Walli, Gertraud: From here to infinity: sparse finite versus Dirichlet process mixtures in model-based clustering (2019)
  12. Glynn, Chris; Tokdar, Surya T.; Howard, Brian; Banks, David L.: Bayesian analysis of dynamic linear topic models (2019)
  13. Jiang, Zhehan; Templin, Jonathan: Gibbs samplers for logistic item response models via the Pólya-gamma distribution: a computationally efficient data-augmentation strategy (2019)
  14. Mastrantonio, Gianluca; Grazian, Clara; Mancinelli, Sara; Bibbona, Enrico: New formulation of the logistic-Gaussian process to analyze trajectory tracking data (2019)
  15. Neelon, Brian: Bayesian zero-inflated negative binomial regression based on Pólya-gamma mixtures (2019)
  16. Sharma, Archit; Saxena, Siddhartha; Rai, Piyush: A flexible probabilistic framework for large-margin mixture of experts (2019)
  17. Xia, Ye-Mao; Tang, Nian-Sheng: Bayesian analysis for mixture of latent variable hidden Markov models with multivariate longitudinal data (2019)
  18. Zens, Gregor: Bayesian shrinkage in mixture-of-experts models: identifying robust determinants of class membership (2019)
  19. Zhang, Chun-Xia; Xu, Shuang; Zhang, Jiang-She: A novel variational Bayesian method for variable selection in logistic regression models (2019)
  20. Chen, Ning; Zhu, Jun; Chen, Jianfei; Chen, Ting: Dropout training for SVMs with data augmentation (2018)
