DiffSharp

DiffSharp is a functional automatic differentiation (AD) library. AD allows exact and efficient calculation of derivatives by systematically applying the chain rule of calculus at the level of elementary operations during program execution. It differs from numerical differentiation, which is prone to truncation and round-off errors, and from symbolic differentiation, which suffers from expression swell and cannot fully handle algorithmic control flow.

Using the DiffSharp library, differentiation (gradients, Hessians, Jacobians, directional derivatives, and matrix-free Hessian- and Jacobian-vector products) is applied using higher-order functions, that is, functions which take other functions as arguments. Differentiated functions can use the full expressive capability of the language, including control flow. DiffSharp allows composition of differentiation using nested forward and reverse AD up to any level, meaning that you can compute exact higher-order derivatives or differentiate functions that are internally making use of differentiation. Please see the API Overview page for a list of available operations.

The library is developed by Atılım Güneş Baydin and Barak A. Pearlmutter, mainly for research applications in machine learning, as part of their work at the Brain and Computation Lab, Hamilton Institute, National University of Ireland Maynooth. DiffSharp is implemented in the F# language and can be used from C# and the other languages running on Mono, .NET Core, or the .NET Framework, targeting the 64-bit platform. It is tested on Linux and Windows. Interfaces/ports to other languages are in progress.
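The chain-rule-at-elementary-operations idea behind forward-mode AD, and the nesting used for exact higher-order derivatives, can be sketched with dual numbers. The following is a minimal illustrative sketch in Python, not DiffSharp's F# API; the `Dual` class and `diff` helper are hypothetical names introduced here for illustration.

```python
# Illustrative sketch of forward-mode AD via dual numbers (NOT DiffSharp's API).
# A dual number carries a primal value together with its derivative (tangent);
# each arithmetic operation applies the chain rule as it executes.
import math

class Dual:
    def __init__(self, val, dot=0.0):
        self.val = val   # primal value
        self.dot = dot   # tangent (derivative) value

    def _wrap(self, x):
        return x if isinstance(x, Dual) else Dual(x)

    def __add__(self, other):
        other = self._wrap(other)
        return Dual(self.val + other.val, self.dot + other.dot)
    __radd__ = __add__

    def __mul__(self, other):
        other = self._wrap(other)
        # product rule: (uv)' = u'v + uv'
        return Dual(self.val * other.val,
                    self.dot * other.val + self.val * other.dot)
    __rmul__ = __mul__

def sin(x):
    # recursive so that nested Dual-of-Dual values also work
    if isinstance(x, Dual):
        return Dual(sin(x.val), cos(x.val) * x.dot)
    return math.sin(x)

def cos(x):
    if isinstance(x, Dual):
        return Dual(cos(x.val), -1.0 * sin(x.val) * x.dot)
    return math.cos(x)

def diff(f, x):
    """Exact derivative of f at x via one forward pass."""
    return f(Dual(x, 1.0)).dot

f = lambda x: x * sin(x)

# first derivative: d/dx [x sin x] = sin x + x cos x
d1 = diff(f, 1.2)

# nesting diff inside itself gives the exact second derivative:
# d2/dx2 [x sin x] = 2 cos x - x sin x
d2 = diff(lambda x: diff(f, x), 1.2)
```

Nesting works because `diff` only requires that its argument support the overloaded arithmetic, so a `Dual` whose components are themselves `Dual` values propagates second-order information; this mirrors (in spirit) DiffSharp's composition of nested AD to any level.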


References in zbMATH (referenced in 39 articles)

Showing results 1 to 20 of 39.
Sorted by year (citations)


  1. Angeli, Andrea; Desmet, Wim; Naets, Frank: Deep learning for model order reduction of multibody systems to minimal coordinates (2021)
  2. Bolte, Jérôme; Pauwels, Edouard: Conservative set valued fields, automatic differentiation, stochastic gradient methods and deep learning (2021)
  3. Haghighat, Ehsan; Juanes, Ruben: SciANN: a Keras/TensorFlow wrapper for scientific computations and physics-informed deep learning using artificial neural networks (2021)
  4. Haghighat, Ehsan; Raissi, Maziar; Moure, Adrian; Gomez, Hector; Juanes, Ruben: A physics-informed deep learning framework for inversion and surrogate modeling in solid mechanics (2021)
  5. Iavernaro, F.; Mazzia, F.; Mukhametzhanov, M. S.; Sergeyev, Ya. D.: Computation of higher order Lie derivatives on the infinity computer (2021)
  6. Lu, Lu; Meng, Xuhui; Mao, Zhiping; Karniadakis, George Em: DeepXDE: a deep learning library for solving differential equations (2021)
  7. Ranade, Rishikesh; Hill, Chris; Pathak, Jay: DiscretizationNet: a machine-learning based solver for Navier-Stokes equations using finite volume discretization (2021)
  8. Zhu, Qiming; Liu, Zeliang; Yan, Jinhui: Machine learning for metal additive manufacturing: predicting temperature and melt pool fluid dynamics using physics-informed neural networks (2021)
  9. Alarifi, Abdulaziz; Alwadain, Ayed: An optimized cognitive-assisted machine translation approach for natural language processing (2020)
  10. Deng, Hao; To, Albert C.: Topology optimization based on deep representation learning (DRL) for compliance and stress-constrained design (2020)
  11. Jagtap, Ameya D.; Kawaguchi, Kenji; Karniadakis, George Em: Adaptive activation functions accelerate convergence in deep and physics-informed neural networks (2020)
  12. Jagtap, Ameya D.; Kharazmi, Ehsan; Karniadakis, George Em: Conservative physics-informed neural networks on discrete domains for conservation laws: applications to forward and inverse problems (2020)
  13. Jo, Hyeontae; Son, Hwijae; Hwang, Hyung Ju; Kim, Eun Heui: Deep neural network approach to forward-inverse problems (2020)
  14. Karumuri, Sharmila; Tripathy, Rohit; Bilionis, Ilias; Panchal, Jitesh: Simulator-free solution of high-dimensional stochastic elliptic partial differential equations using deep neural networks (2020)
  15. Katrutsa, Alexandr; Daulbaev, Talgat; Oseledets, Ivan: Black-box learning of multigrid parameters (2020)
  16. Laue, Sören; Mitterreiter, Matthias; Giesen, Joachim: A simple and efficient tensor calculus for machine learning (2020)
  17. Mao, Zhiping; Jagtap, Ameya D.; Karniadakis, George Em: Physics-informed neural networks for high-speed flows (2020)
  18. Mohamed, Shakir; Rosca, Mihaela; Figurnov, Michael; Mnih, Andriy: Monte Carlo gradient estimation in machine learning (2020)
  19. Norman, Matthew; Larkin, Jeffrey: A holistic algorithmic approach to improving accuracy, robustness, and computational efficiency for atmospheric dynamics (2020)
  20. Peñuñuri, F.; Peón, R.; González-Sánchez, D.; Escalante Soberanis, M. A.: Dual numbers and automatic differentiation to efficiently compute velocities and accelerations (2020)