Automatic Variational Inference in Stan.

Variational inference is a scalable technique for approximate Bayesian inference. Deriving variational inference algorithms requires tedious model-specific calculations, which makes them difficult to automate. We propose an automatic variational inference algorithm, automatic differentiation variational inference (ADVI). The user provides only a Bayesian model and a dataset; nothing else is required. We make no conjugacy assumptions and support a broad class of models. The algorithm automatically determines an appropriate variational family and optimizes the variational objective. We implement ADVI in Stan (code available now), a probabilistic programming framework. We compare ADVI to MCMC sampling across hierarchical generalized linear models, nonconjugate matrix factorization, and a mixture model. We train the mixture model on a quarter million images. With ADVI we can use variational inference on any model we write in Stan.
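The recipe the abstract describes — fit a Gaussian approximation in unconstrained parameter space by stochastic gradient ascent on the variational objective (the ELBO), using reparameterization gradients from automatic differentiation — can be sketched on a toy model where the exact posterior is known in closed form. This is a minimal illustration, not the paper's implementation; the step size and iteration count are arbitrary choices, and a real ADVI run would also handle constrained parameters via automatic transformations:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy conjugate model: y_i ~ Normal(mu, 1), prior mu ~ Normal(0, 1).
y = rng.normal(1.5, 1.0, size=100)
n = len(y)

def grad_log_joint(mu):
    # d/dmu [ log N(mu | 0, 1) + sum_i log N(y_i | mu, 1) ]
    return -mu + np.sum(y - mu)

# Mean-field Gaussian q(mu) = N(m, exp(s)^2), optimized by stochastic
# gradient ascent on the ELBO with the reparameterization trick.
m, s = 0.0, 0.0
lr = 1e-3  # illustrative fixed step size (the paper uses adaptive steps)
for _ in range(5000):
    eps = rng.standard_normal()
    mu = m + np.exp(s) * eps             # reparameterized draw from q
    g = grad_log_joint(mu)
    grad_m = g                           # dELBO/dm
    grad_s = g * np.exp(s) * eps + 1.0   # dELBO/ds (+1 from q's entropy)
    m += lr * grad_m
    s += lr * grad_s

# Exact posterior for comparison: N(sum(y)/(n+1), 1/(n+1)).
post_mean = y.sum() / (n + 1)
post_sd = (1.0 / (n + 1)) ** 0.5
```

Because this toy model is conjugate, the fitted variational mean `m` and standard deviation `exp(s)` can be checked directly against the exact Gaussian posterior; ADVI's point is that the same gradient-based recipe applies when no such closed form exists.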

References in zbMATH (referenced in 27 articles, 2 standard articles)

Showing results 1 to 20 of 27.
Sorted by year (citations)


  1. Kelter, Riko: Bayesian model selection in the (\mathcal{M})-open setting -- approximate posterior inference and subsampling for efficient large-scale leave-one-out cross-validation via the difference estimator (2021)
  2. Nemeth, Christopher; Fearnhead, Paul: Stochastic gradient Markov chain Monte Carlo (2021)
  3. Andrade, Daniel; Takeda, Akiko; Fukumizu, Kenji: Robust Bayesian model selection for variable clustering with the Gaussian graphical model (2020)
  4. Drori, Iddo: Deep variational inference (2020)
  5. Karimi, Belhal; Lavielle, Marc; Moulines, Eric: f-SAEM: a fast stochastic approximation of the EM algorithm for nonlinear mixed effects models (2020)
  6. Nguyen, Hoang; Ausín, M. Concepción; Galeano, Pedro: Variational inference for high dimensional structured factor copulas (2020)
  7. Nott, David J.; Wang, Xueou; Evans, Michael; Englert, Berthold-Georg: Checking for prior-data conflict using prior-to-posterior divergences (2020)
  8. Pan, Shaowu; Duraisamy, Karthik: Physics-informed probabilistic learning of linear embeddings of nonlinear dynamics with guaranteed stability (2020)
  9. René, Alexandre; Longtin, André; Macke, Jakob H.: Inference of a mesoscopic population model from population spike trains (2020)
  10. Saha, Abhijoy; Bharath, Karthik; Kurtek, Sebastian: A geometric variational approach to Bayesian inference (2020)
  11. Ye, Lifeng; Beskos, Alexandros; De Iorio, Maria; Hao, Jie: Monte Carlo co-ordinate ascent variational inference (2020)
  12. Yu, Hang; Wu, Songwei; Xin, Luyin; Dauwels, Justin: Fast Bayesian inference of sparse networks with automatic sparsity determination (2020)
  13. Azzimonti, Laura; Corani, Giorgio; Zaffalon, Marco: Hierarchical estimation of parameters in Bayesian networks (2019)
  14. Barajas-Solano, D. A.; Tartakovsky, A. M.: Approximate Bayesian model inversion for PDEs with heterogeneous and state-dependent coefficients (2019)
  15. McLean, M. W.; Wand, M. P.: Variational message passing for elaborate response regression models (2019)
  16. Wang, Yixin; Blei, David M.: Frequentist consistency of variational Bayes (2019)
  17. Wang, Yixin; Blei, David M.: The blessings of multiple causes (2019)
  18. Baydin, Atılım Güneş; Pearlmutter, Barak A.; Radul, Alexey Andreyevich; Siskind, Jeffrey Mark: Automatic differentiation in machine learning: a survey (2018)
  19. Giordano, Ryan; Broderick, Tamara; Jordan, Michael I.: Covariances, robustness, and variational Bayes (2018)
  20. MacNab, Ying C.: Rejoinder on: “Some recent work on multivariate Gaussian Markov random fields” (2018)
