neural-tangents
Neural Tangents: Fast and Easy Infinite Neural Networks in Python. Neural Tangents is a library designed to enable research into infinite-width neural networks. It provides a high-level API for specifying complex and hierarchical neural network architectures. These networks can then be trained and evaluated either at finite width, as usual, or in their infinite-width limit. Infinite-width networks can be trained analytically using exact Bayesian inference or using gradient descent via the Neural Tangent Kernel. Additionally, Neural Tangents provides tools to study the gradient descent training dynamics of wide but finite networks, in either function space or weight space. The entire library runs out of the box on CPU, GPU, or TPU. All computations can be automatically distributed over multiple accelerators with near-linear scaling in the number of devices. Neural Tangents is available at https://github.com/google/neural-tangents. We also provide an accompanying interactive Colab notebook.
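The workflow described above can be illustrated with the library's public API. The following is a minimal sketch, assuming neural-tangents and JAX are installed; the architecture and data shapes are arbitrary choices for illustration.

```python
import jax.numpy as jnp
from jax import random
import neural_tangents as nt
from neural_tangents import stax

# Specify a hierarchical architecture with the high-level stax API.
# stax.serial returns a triple (init_fn, apply_fn, kernel_fn): the first two
# define the usual finite-width network, while kernel_fn computes the
# corresponding infinite-width NNGP/NTK kernels analytically.
init_fn, apply_fn, kernel_fn = stax.serial(
    stax.Dense(512), stax.Relu(),
    stax.Dense(512), stax.Relu(),
    stax.Dense(1),
)

# Toy regression data (shapes chosen for illustration only).
key1, key2 = random.split(random.PRNGKey(0))
x_train = random.normal(key1, (20, 10))
y_train = jnp.sin(jnp.sum(x_train, axis=1, keepdims=True))
x_test = random.normal(key2, (5, 10))

# Closed-form test predictions of the infinite-width network:
# 'nngp' corresponds to exact Bayesian inference, 'ntk' to training the
# network to convergence by gradient descent on MSE loss via the
# Neural Tangent Kernel.
predict_fn = nt.predict.gradient_descent_mse_ensemble(
    kernel_fn, x_train, y_train)
y_nngp, y_ntk = predict_fn(x_test=x_test, get=('nngp', 'ntk'))
```

For the function-space view of wide but finite networks, the library also exposes `nt.linearize(apply_fn, params)`, which returns the first-order Taylor expansion of a finite network around given parameters and can be trained like any other JAX function.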
References in zbMATH (referenced in 5 articles, 1 standard article)
Sorted by year:
- Ghorbani, Behrooz; Mei, Song; Misiakiewicz, Theodor; Montanari, Andrea: When do neural networks outperform kernel methods? (2021)
- Li, Zhihan; Fan, Yuwei; Ying, Lexing: Multilevel fine-tuning: closing generalization gaps in approximation of solution maps under a limited budget for training data (2021)
- Lee, Jaehoon; Xiao, Lechao; Schoenholz, Samuel S.; Bahri, Yasaman; Novak, Roman; Sohl-Dickstein, Jascha; Pennington, Jeffrey: Wide neural networks of any depth evolve as linear models under gradient descent (2020)
- Sun, Ruo-Yu: Optimization for deep learning: an overview (2020)
- Novak, Roman; Xiao, Lechao; Hron, Jiri; Lee, Jaehoon; Alemi, Alexander A.; Sohl-Dickstein, Jascha; Schoenholz, Samuel S.: Neural Tangents: Fast and Easy Infinite Neural Networks in Python (2019) arXiv