Pyro is a universal probabilistic programming language (PPL) written in Python and supported by PyTorch on the backend. Pyro enables flexible and expressive deep probabilistic modeling, unifying the best of modern deep learning and Bayesian modeling. It was designed with these key principles:

- Universal: Pyro can represent any computable probability distribution.
- Scalable: Pyro scales to large data sets with little overhead.
- Minimal: Pyro is implemented with a small core of powerful, composable abstractions.
- Flexible: Pyro aims for automation when you want it, control when you need it.

References in zbMATH (referenced in 15 articles)

Sorted by year (citations)

  1. Stankovič, Miroslav; Bartocci, Ezio; Kovács, Laura: Moment-based analysis of Bayesian network properties (2022)
  2. Lukas Prediger, Niki Loppi, Samuel Kaski, Antti Honkela: d3p - A Python Package for Differentially-Private Probabilistic Programming (2021) arXiv
  3. Mathieu Besançon, Theodore Papamarkou, David Anthoff, Alex Arslan, Simon Byrne, Dahua Lin, John Pearson: Distributions.jl: Definition and Modeling of Probability Distributions in the JuliaStats Ecosystem (2021) not zbMATH
  4. Oseledets, I. V.; Kharyuk, P. V.: Structuring data with block term decomposition: decomposition of joint tensors and variational block term decomposition as a parametrized mixture distribution model (2021)
  5. Alexander M. Rush: Torch-Struct: Deep Structured Prediction Library (2020) arXiv
  6. Alexandrov, Alexander; Benidis, Konstantinos; Bohlke-Schneider, Michael; Flunkert, Valentin; Gasthaus, Jan; Januschowski, Tim; Maddix, Danielle C.; Rangapuram, Syama; Salinas, David; Schulz, Jasper; Stella, Lorenzo; Türkmen, Ali Caner; Wang, Yuyang: GluonTS: probabilistic and neural time series modeling in Python (2020)
  7. Brehmer, Johann; Louppe, Gilles; Pavez, Juan; Cranmer, Kyle: Mining gold from implicit models to improve likelihood-free inference (2020)
  8. Bürkner, Paul-Christian; Gabry, Jonah; Vehtari, Aki: Approximate leave-future-out cross-validation for Bayesian time series models (2020)
  9. Drori, Iddo: Deep variational inference (2020)
  10. Hillerström, Daniel; Lindley, Sam; Atkey, Robert: Effect handlers via generalised continuations (2020)
  11. Lüdtke, Stefan; Kirste, Thomas: Lifted Bayesian filtering in multiset rewriting systems (2020)
  12. Mario Morvan, Angelos Tsiaras, Nikolaos Nikolaou, Ingo P. Waldmann: PyLightcurve-torch: a transit modelling package for deep learning applications in PyTorch (2020) arXiv
  13. Du Phan, Neeraj Pradhan, Martin Jankowiak: Composable Effects for Flexible and Accelerated Probabilistic Programming in NumPyro (2019) arXiv
  14. Kumar, R.; Colin, C.; Hartikainen, A.; Martin, O. A.: ArviZ a unified library for exploratory analysis of Bayesian models in Python. (2019) not zbMATH
  15. Guillaume Baudart, Martin Hirzel, Kiran Kate, Louis Mandel, Avraham Shinnar: Yaps: Python Frontend to Stan (2018) arXiv