ADVI: Automatic Variational Inference in Stan

Variational inference is a scalable technique for approximate Bayesian inference. Deriving variational inference algorithms requires tedious model-specific calculations, which makes them difficult to automate. We propose automatic differentiation variational inference (ADVI), an automatic variational inference algorithm. The user provides only a Bayesian model and a dataset; nothing else. We make no conjugacy assumptions and support a broad class of models. The algorithm automatically determines an appropriate variational family and optimizes the variational objective. We implement ADVI in Stan (code available now), a probabilistic programming framework. We compare ADVI to MCMC sampling across hierarchical generalized linear models, nonconjugate matrix factorization, and a mixture model. We train the mixture model on a quarter million images. With ADVI we can use variational inference on any model we write in Stan.
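The recipe the abstract describes — move the parameters to an unconstrained space, posit a Gaussian variational family, and optimize the variational objective with stochastic gradients from the reparameterization trick — can be sketched on a toy model. The model, the NumPy implementation, and all tuning constants below are illustrative assumptions for this sketch, not Stan's actual code; in the toy model the parameter is already unconstrained, so the transform is the identity.

```python
# Minimal mean-field ADVI sketch (an illustration of the idea, not Stan's
# implementation). Assumed toy model:
#   y_i ~ Normal(theta, 1),  theta ~ Normal(0, 1).
# theta is already unconstrained, so the transform to R is the identity.
# Variational family: q(theta) = Normal(mu, exp(omega)^2).
import numpy as np

rng = np.random.default_rng(0)
y = rng.normal(2.0, 1.0, size=50)      # synthetic data
n = len(y)

mu, omega = 0.0, 0.0                   # variational parameters
lr, n_mc, n_iter = 0.01, 50, 3000

def grad_log_joint(theta):
    # d/dtheta [log p(y | theta) + log p(theta)] for each sampled theta
    return np.sum(y) - (n + 1) * theta

for t in range(n_iter):
    step = lr / (1.0 + 0.01 * t)       # decaying step size
    eps = rng.normal(size=n_mc)        # reparameterization noise
    sigma = np.exp(omega)
    theta = mu + sigma * eps           # theta = mu + sigma * eps
    g = grad_log_joint(theta)
    # Monte Carlo estimates of the ELBO gradients; the "+ 1" comes from
    # the entropy of q, whose derivative w.r.t. omega is 1.
    mu += step * np.mean(g)
    omega += step * (np.mean(g * sigma * eps) + 1.0)

# The toy model is conjugate, so the exact posterior is
# Normal(sum(y) / (n + 1), 1 / (n + 1)); the Gaussian approximation
# found by this sketch should recover it closely.
print(mu, np.exp(omega))
```

In Stan itself none of this is written by hand: the user supplies only the model program and the data, and ADVI derives the transform and the gradients automatically via automatic differentiation.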


References in zbMATH (referenced in 23 articles, 2 standard articles)

Showing results 1 to 20 of 23.


  1. Andrade, Daniel; Takeda, Akiko; Fukumizu, Kenji: Robust Bayesian model selection for variable clustering with the Gaussian graphical model (2020)
  2. Drori, Iddo: Deep variational inference (2020)
  3. Karimi, Belhal; Lavielle, Marc; Moulines, Eric: f-SAEM: a fast stochastic approximation of the EM algorithm for nonlinear mixed effects models (2020)
  4. Pan, Shaowu; Duraisamy, Karthik: Physics-informed probabilistic learning of linear embeddings of nonlinear dynamics with guaranteed stability (2020)
  5. René, Alexandre; Longtin, André; Macke, Jakob H.: Inference of a mesoscopic population model from population spike trains (2020)
  6. Saha, Abhijoy; Bharath, Karthik; Kurtek, Sebastian: A geometric variational approach to Bayesian inference (2020)
  7. Ye, Lifeng; Beskos, Alexandros; De Iorio, Maria; Hao, Jie: Monte Carlo co-ordinate ascent variational inference (2020)
  8. Yu, Hang; Wu, Songwei; Xin, Luyin; Dauwels, Justin: Fast Bayesian inference of sparse networks with automatic sparsity determination (2020)
  9. Azzimonti, Laura; Corani, Giorgio; Zaffalon, Marco: Hierarchical estimation of parameters in Bayesian networks (2019)
  10. Cox, Marco; van de Laar, Thijs; de Vries, Bert: A factor graph approach to automated design of Bayesian signal processing algorithms (2019)
  11. McLean, M. W.; Wand, M. P.: Variational message passing for elaborate response regression models (2019)
  12. Wang, Yixin; Blei, David M.: The blessings of multiple causes (2019)
  13. Wang, Yixin; Blei, David M.: Frequentist consistency of variational Bayes (2019)
  14. Baydin, Atılım Güneş; Pearlmutter, Barak A.; Radul, Alexey Andreyevich; Siskind, Jeffrey Mark: Automatic differentiation in machine learning: a survey (2018)
  15. Giordano, Ryan; Broderick, Tamara; Jordan, Michael I.: Covariances, robustness, and variational Bayes (2018)
  16. MacNab, Ying C.: Rejoinder on: “Some recent work on multivariate Gaussian Markov random fields” (2018)
  17. Ong, Victor M. H.; Nott, David J.; Tran, Minh-Ngoc; Sisson, Scott A.; Drovandi, Christopher C.: Variational Bayes with synthetic likelihood (2018)
  18. Srivastava, Sanvesh; Li, Cheng; Dunson, David B.: Scalable Bayes via barycenter in Wasserstein space (2018)
  19. Tan, Linda S. L.; Nott, David J.: Gaussian variational approximation with sparse precision matrices (2018)
  20. Yao, Yuling; Vehtari, Aki; Simpson, Daniel; Gelman, Andrew: Using stacking to average Bayesian predictive distributions (with discussion) (2018)
