FADBAD++

FADBAD++ implements the forward, backward, and Taylor methods of automatic differentiation using C++ templates and operator overloading. These AD templates let the user differentiate functions implemented with arithmetic types such as doubles and intervals. A key idea in FADBAD++ is that the AD-template types themselves behave like arithmetic types: a C++ function can therefore be differentiated simply by replacing every occurrence of the original arithmetic type with its AD-template version. This transparency of behavior also makes it possible to obtain higher-order derivatives by applying the AD templates to themselves, so the AD methods can be combined very easily.


References in zbMATH (referenced in 58 articles)

Showing results 1 to 20 of 58, sorted by year (citations).


  1. Schweidtmann, Artur M.; Bongartz, Dominik; Grothe, Daniel; Kerkenhoff, Tim; Lin, Xiaopeng; Najman, Jaromił; Mitsos, Alexander: Deterministic global optimization with Gaussian processes embedded (2021)
  2. Stordal, Andreas S.; Moraes, Rafael J.; Raanes, Patrick N.; Evensen, Geir: p-kernel Stein variational gradient descent for data assimilation and history matching (2021)
  3. Asaithambi, Asai: Solution of third grade thin film flow using algorithmic differentiation (2020)
  4. Bongartz, Dominik; Najman, Jaromił; Mitsos, Alexander: Deterministic global optimization of steam cycles using the IAPWS-IF97 model (2020)
  5. Ernsthausen, John M.; Nedialkov, Nedialko S.: Stepsize selection in the rigorous defect control of Taylor series methods (2020)
  6. Albu, Alla; Gorchakov, Andrei; Zubov, Vladimir: On the effectiveness of the fast automatic differentiation methodology (2019)
  7. Najman, Jaromił; Mitsos, Alexander: Tighter McCormick relaxations through subgradient propagation (2019)
  8. Schweidtmann, Artur M.; Mitsos, Alexander: Deterministic global optimization with artificial neural networks embedded (2019)
  9. Baydin, Atılım Güneş; Pearlmutter, Barak A.; Radul, Alexey Andreyevich; Siskind, Jeffrey Mark: Automatic differentiation in machine learning: a survey (2018)
  10. Sagebaum, Max; Albring, T.; Gauger, N. R.: Expression templates for primal value taping in the reverse mode of algorithmic differentiation (2018)
  11. Siskind, Jeffrey Mark; Pearlmutter, Barak A.: Divide-and-conquer checkpointing for arbitrary programs with no user annotation (2018)
  12. Bongartz, Dominik; Mitsos, Alexander: Deterministic global optimization of process flowsheets in a reduced space using McCormick relaxations (2017)
  13. Guzman, Yannis A.; Faruque Hasan, M. M.; Floudas, Christodoulos A.: Performance of convex underestimators in a branch-and-bound framework (2016)
  14. Pérez-Galván, Carlos; Bogle, I. David L.: Dynamic global optimization methods for determining guaranteed solutions in chemical engineering (2016)
  15. Rauh, Andreas; Senkel, Luise; Aschemann, Harald; Saurin, Vasily V.; Kostin, Georgy V.: An integrodifferential approach to modeling, control, state estimation and optimization for heat transfer systems (2016)
  16. Rauh, Andreas; Senkel, Luise; Kersten, Julia; Aschemann, Harald: Reliable control of high-temperature fuel cell systems using interval-based sliding mode techniques (2016)
  17. Sluşanschi, Emil I.; Dumitrel, Vlad: ADiJaC -- automatic differentiation of Java classfiles (2016)
  18. Bartha, Ferenc A.; Munthe-Kaas, Hans Z.: Computing of B-series by automatic differentiation (2014)
  19. Cyranka, Jacek: Efficient and generic algorithm for rigorous integration forward in time of dPDEs. I (2014)
  20. Giunta, G.; Koutsawa, Y.; Belouettar, S.; Hu, H.: Analysis of nano-plates by atomistic-refined models accounting for surface free energy effect (2014)
