Autograd

Autograd can automatically differentiate native Python and NumPy code. It handles a large subset of Python's features, including loops, ifs, recursion, and closures, and it can even take derivatives of derivatives of derivatives. It supports reverse-mode differentiation (a.k.a. backpropagation), which means it can efficiently take gradients of scalar-valued functions with respect to array-valued arguments, as well as forward-mode differentiation, and the two can be composed arbitrarily. The main intended application of Autograd is gradient-based optimization. For more information, check out the tutorial and the examples directory.
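
As a concrete illustration, here is a minimal sketch of the typical workflow, closely following the library's own tanh example: ordinary code is written against the thinly wrapped autograd.numpy module, and grad turns a scalar-valued Python function into a function that computes its (reverse-mode) derivative; applying grad again gives higher-order derivatives.

    import autograd.numpy as np   # thinly wrapped NumPy
    from autograd import grad     # builds a derivative function (reverse mode)

    def tanh(x):
        # Ordinary Python/NumPy code; no special annotations required.
        return (1.0 - np.exp(-x)) / (1.0 + np.exp(-x))

    d_tanh = grad(tanh)           # first derivative
    dd_tanh = grad(d_tanh)        # derivatives compose: second derivative

    print(d_tanh(1.0))                               # ~0.3932
    print((tanh(1.0001) - tanh(0.9999)) / 0.0002)    # finite-difference check
    print(dd_tanh(1.0))                              # second derivative at x = 1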


References in zbMATH (referenced in 24 articles)

Showing results 1 to 20 of 24, sorted by year (citations).

  1. Curtin, Ryan R.; Edel, Marcus; Prabhu, Rahul Ganesh; Basak, Suryoday; Lou, Zhihao; Sanderson, Conrad: The ensmallen library for flexible numerical optimization (2021)
  2. Derryn Knife: SurPyval: Survival Analysis with Python (2021) not zbMATH
  3. Schoenholz, Samuel S.; Cubuk, Ekin D.: JAX, M.D. a framework for differentiable physics (2021)
  4. Vlassis, Nikolaos N.; Sun, WaiChing: Sobolev training of thermodynamic-informed neural networks for interpretable elasto-plasticity models with level set hardening (2021)
  5. Julian Blank, Kalyanmoy Deb: pymoo: Multi-objective Optimization in Python (2020) arXiv
  6. Kamm, Jack; Terhorst, Jonathan; Durbin, Richard; Song, Yun S.: Efficiently inferring the demographic history of many populations with allele count data (2020)
  7. Katrutsa, Alexandr; Daulbaev, Talgat; Oseledets, Ivan: Black-box learning of multigrid parameters (2020)
  8. Laue, Sören; Mitterreiter, Matthias; Giesen, Joachim: A simple and efficient tensor calculus for machine learning (2020)
  9. Lee, Jaehoon; Xiao, Lechao; Schoenholz, Samuel S.; Bahri, Yasaman; Novak, Roman; Sohl-Dickstein, Jascha; Pennington, Jeffrey: Wide neural networks of any depth evolve as linear models under gradient descent (2020)
  10. R. Adhikari, Austen Bolitho, Fernando Caballero, Michael E. Cates, Jakub Dolezal, Timothy Ekeh, Jules Guioth, Robert L. Jack, Julian Kappler, Lukas Kikuchi, Hideki Kobayashi, Yuting I. Li, Joseph D. Peterson, Patrick Pietzonka, Benjamin Remez, Paul B. Rohrbach, Rajesh Singh, Günther Turk: Inference, prediction and optimization of non-pharmaceutical interventions using compartment models: the PyRoss library (2020) arXiv
  11. Blanchard, Antoine; Sapsis, Themistoklis P.: Learning the tangent space of dynamical instabilities from data (2019)
  12. Daniel Smilkov, Nikhil Thorat, Yannick Assogba, Ann Yuan, Nick Kreeger, Ping Yu, Kangyi Zhang, Shanqing Cai, Eric Nielsen, David Soergel, Stan Bileschi, Michael Terry, Charles Nicholson, Sandeep N. Gupta, Sarah Sirajuddin, D. Sculley, Rajat Monga, Greg Corrado, Fernanda B. Viegas, Martin Wattenberg: TensorFlow.js: Machine Learning for the Web and Beyond (2019) arXiv
  13. Ghosh, Soumya; Yao, Jiayu; Doshi-Velez, Finale: Model selection in Bayesian neural networks via horseshoe priors (2019)
  14. Masood, Muhammad A.; Doshi-Velez, Finale: A particle-based variational approach to Bayesian non-negative matrix factorization (2019)
  15. Oates, Chris J.; Cockayne, Jon; Briol, François-Xavier; Girolami, Mark: Convergence rates for a class of estimators based on Stein’s method (2019)
  16. Dan Moldovan, James M Decker, Fei Wang, Andrew A Johnson, Brian K Lee, Zachary Nado, D Sculley, Tiark Rompf, Alexander B Wiltschko: AutoGraph: Imperative-style Coding with Graph-based Performance (2018) arXiv
  17. Giordano, Ryan; Broderick, Tamara; Jordan, Michael I.: Covariances, robustness, and variational Bayes (2018)
  18. Giraldi, Loïc; Le Maître, Olivier P.; Hoteit, Ibrahim; Knio, Omar M.: Optimal projection of observations in a Bayesian setting (2018)
  19. Shikhar Bhardwaj, Ryan R. Curtin, Marcus Edel, Yannis Mentekidis, Conrad Sanderson: ensmallen: a flexible C++ library for efficient function optimization (2018) arXiv
  20. Srajer, Filip; Kukelova, Zuzana; Fitzgibbon, Andrew: A benchmark of selected algorithmic differentiation tools on some problems in computer vision and machine learning (2018)
