ANODEs
Augmented Neural ODEs. We show that Neural Ordinary Differential Equations (ODEs) learn representations that preserve the topology of the input space and prove that this implies the existence of functions Neural ODEs cannot represent. To address these limitations, we introduce Augmented Neural ODEs which, in addition to being more expressive models, are empirically more stable, generalize better and have a lower computational cost than Neural ODEs.
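The augmentation idea is simple enough to sketch in code: the input is concatenated with extra zero-valued dimensions before the ODE flow is integrated, giving trajectories room to move around one another (a plain Neural ODE flow is a homeomorphism, so trajectories cannot cross). Below is a minimal sketch, assuming PyTorch and the torchdiffeq package are available; the class names ODEFunc and AugmentedODEBlock, the hidden width, and the integration interval [0, 1] are illustrative choices, not taken from the authors' code.

```python
import torch
import torch.nn as nn
from torchdiffeq import odeint  # assumes torchdiffeq is installed


class ODEFunc(nn.Module):
    """Vector field f(t, h), parameterized here by a small MLP."""

    def __init__(self, dim):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim, 64),
            nn.Tanh(),
            nn.Linear(64, dim),
        )

    def forward(self, t, h):
        return self.net(h)


class AugmentedODEBlock(nn.Module):
    """Pad the state with `aug_dim` zero channels before integrating,
    so the flow evolves in a higher-dimensional space."""

    def __init__(self, data_dim, aug_dim):
        super().__init__()
        self.aug_dim = aug_dim
        self.func = ODEFunc(data_dim + aug_dim)

    def forward(self, x):
        # Lift the input into the augmented space: h0 = [x, 0]
        zeros = x.new_zeros(x.shape[0], self.aug_dim)
        h0 = torch.cat([x, zeros], dim=1)
        t = torch.tensor([0.0, 1.0])
        # Integrate dh/dt = f(t, h) from t=0 to t=1; keep the final state
        return odeint(self.func, h0, t)[-1]


# Usage: a 2-D input augmented with 3 extra dimensions
block = AugmentedODEBlock(data_dim=2, aug_dim=3)
out = block(torch.randn(8, 2))  # shape (8, 5)
```

Setting aug_dim to zero recovers an ordinary Neural ODE block, which makes the construction easy to compare against the unaugmented baseline.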
References in zbMATH (referenced in 8 articles)
Sorted by year
- Avelin, Benny; Nyström, Kaj: Neural ODEs as the deep limit of ResNets with constant weights (2021)
- Celledoni, E.; Ehrhardt, M. J.; Etmann, C.; McLachlan, R. I.; Owren, B.; Schönlieb, C.-B.; Sherry, F.: Structure-preserving deep learning (2021)
- Giesecke, Elisa; Kröner, Axel: Classification with Runge-Kutta networks and feature space augmentation (2021)
- Li, Jingshi; Chen, Song; Cao, Yanzhao; Sun, Zhao: A neural network approach to sampling based learning control for quantum system with uncertainty (2021)
- Papamakarios, George; Nalisnick, Eric; Rezende, Danilo Jimenez; Mohamed, Shakir; Lakshminarayanan, Balaji: Normalizing flows for probabilistic modeling and inference (2021)
- Roesch, Elisabeth; Rackauckas, Christopher; Stumpf, Michael P. H.: Collocation based training of neural ordinary differential equations (2021)
- Cuchiero, Christa; Larsson, Martin; Teichmann, Josef: Deep neural networks, generic universal interpolation, and controlled ODEs (2020)
- Poli, Michael; Massaroli, Stefano; Yamashita, Atsushi; Asama, Hajime; Park, Jinkyoo: TorchDyn: a neural differential equations library (2020) arXiv