ForwardDiff

Forward-Mode Automatic Differentiation in Julia. We present ForwardDiff, a Julia package for forward-mode automatic differentiation (AD) featuring performance competitive with low-level languages like C++. Unlike recently developed AD tools in other popular high-level languages such as Python and MATLAB, ForwardDiff takes advantage of just-in-time (JIT) compilation to transparently recompile AD-unaware user code, enabling efficient support for higher-order differentiation and differentiation using custom number types (including complex numbers). For gradient and Jacobian calculations, ForwardDiff provides a variant of vector-forward mode that avoids expensive heap allocation and makes better use of memory bandwidth than traditional vector mode. In our numerical experiments, we demonstrate that for nontrivially large dimensions, ForwardDiff’s gradient computations can be faster than a reverse-mode implementation from the Python-based autograd package. We also illustrate how ForwardDiff is used effectively within JuMP, a modeling language for optimization. According to our usage statistics, 41 unique repositories on GitHub depend on ForwardDiff, with users from diverse fields such as astronomy, optimization, finite element analysis, and statistics. This document is an extended abstract that has been accepted for presentation at AD2016, the 7th International Conference on Algorithmic Differentiation.
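As a minimal sketch of the interface described above (the target function f is an illustrative example, and the explicit chunk-size configuration assumes a reasonably recent release of the package):

    using ForwardDiff

    # An ordinary, AD-unaware Julia function; ForwardDiff differentiates it
    # by JIT-compiling a specialization that propagates dual numbers.
    f(x) = sum(sin, x) + prod(tan, x) * sum(sqrt, x)

    x = rand(5)

    g = ForwardDiff.gradient(f, x)   # length-5 gradient of f at x
    H = ForwardDiff.hessian(f, x)    # 5x5 Hessian via nested forward mode

    # The chunk size of the vector-forward mode (how many partial
    # derivatives are propagated per evaluation pass) can be set explicitly:
    cfg = ForwardDiff.GradientConfig(f, x, ForwardDiff.Chunk{5}())
    g2 = ForwardDiff.gradient(f, x, cfg)

Because each chunk of partial derivatives is carried in stack-allocated dual numbers rather than a heap-allocated perturbation vector, a gradient of dimension N costs roughly ceil(N/chunk) evaluations of f without per-call allocation of derivative storage; this is the allocation-avoiding variant of vector mode that the abstract refers to.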


References in zbMATH (referenced in 19 articles, 1 standard article)


  1. Arnaudon, Alexis; van der Meulen, Frank; Schauer, Moritz; Sommer, Stefan: Diffusion bridges for stochastic Hamiltonian systems and shape evolutions (2022)
  2. Chan, Jesse; Taylor, Christina G.: Efficient computation of Jacobian matrices for entropy stable summation-by-parts schemes (2022)
  3. Fairbrother, Jamie; Nemeth, Christopher; Rischard, Maxime; Brea, Johanni; Pinder, Thomas: GaussianProcesses.jl: A Nonparametric Bayes Package for the Julia Language (2022) not zbMATH
  4. Oseledets, Ivan; Fanaskov, Vladimir: Direct optimization of BPX preconditioners (2022)
  5. McDonnell, Taylor; Ning, Andrew: GXBeam: A Pure Julia Implementation of Geometrically Exact Beam Theory (2022) not zbMATH
  6. Besançon, Mathieu; Papamarkou, Theodore; Anthoff, David; Arslan, Alex; Byrne, Simon; Lin, Dahua; Pearson, John: Distributions.jl: Definition and Modeling of Probability Distributions in the JuliaStats Ecosystem (2021) not zbMATH
  7. Papamarkou, Theodore; Lindo, Alexey; Ford, Eric B.: Geometric adaptive Monte Carlo in random environment (2021)
  8. Xu, Kailai; Tartakovsky, Alexandre M.; Burghardt, Jeff; Darve, Eric: Learning viscoelasticity models from indirect data using deep neural networks (2021)
  9. Cancès, Clément; Chainais-Hillairet, Claire; Fuhrmann, Jürgen; Gaudeul, Benoît: On four numerical schemes for a unipolar degenerate drift-diffusion model (2020)
  10. Després, Bruno; Ancellin, Matthieu: A functional equation with polynomial solutions and application to neural networks (2020)
  11. Milz, Johannes; Ulbrich, Michael: An approximation scheme for distributionally robust nonlinear optimization (2020)
  12. Orban, Dominique; Siqueira, Abel Soares: A regularization method for constrained nonlinear least squares (2020)
  13. Castelani, Emerson V.; Lopes, Ronaldo; Shirabayashi, Wesley V. I.; Sobral, Francisco N. C.: RAFF.jl: Robust Algebraic Fitting Function in Julia (2019) not zbMATH
  14. Besard, Tim; Churavy, Valentin; Edelman, Alan; De Sutter, Bjorn: Rapid software prototyping for heterogeneous and distributed platforms (2019) not zbMATH
  15. Wormell, Caroline: Spectral Galerkin methods for transfer operators in uniformly expanding dynamics (2019)
  16. Baydin, Atılım Güneş; Pearlmutter, Barak A.; Radul, Alexey Andreyevich; Siskind, Jeffrey Mark: Automatic differentiation in machine learning: a survey (2018)
  17. Schmitt, Jeremy; Shingel, Tatiana; Leok, Melvin: Lagrangian and Hamiltonian Taylor variational integrators (2018)
  18. Dunning, Iain; Huchette, Joey; Lubin, Miles: JuMP: a modeling language for mathematical optimization (2017)
  19. Revels, Jarrett; Lubin, Miles; Papamarkou, Theodore: Forward-Mode Automatic Differentiation in Julia (2016) arXiv