Lasagne
Lasagne is a lightweight library for building and training neural networks in Theano. Its main features are:
- Supports feed-forward networks such as Convolutional Neural Networks (CNNs), recurrent networks including Long Short-Term Memory (LSTM), and any combination thereof.
- Allows architectures with multiple inputs and multiple outputs, including auxiliary classifiers.
- Many optimization methods, including Nesterov momentum, RMSprop and ADAM.
- Freely definable cost function and no need to derive gradients, thanks to Theano's symbolic differentiation.
- Transparent support of CPUs and GPUs through Theano's expression compiler.
References in zbMATH (referenced in 7 articles)
Sorted by year:
- Aggarwal, Charu C.: Neural networks and deep learning. A textbook (2018)
- Hubara, Itay; Courbariaux, Matthieu; Soudry, Daniel; El-Yaniv, Ran; Bengio, Yoshua: Quantized neural networks: training neural networks with low precision weights and activations (2018)
- Hao Dong, Akara Supratak, Luo Mai, Fangde Liu, Axel Oehmichen, Simiao Yu, Yike Guo: TensorLayer: A Versatile Library for Efficient Deep Learning Development (2017) arXiv
- Jiaxin Shi, Jianfei Chen, Jun Zhu, Shengyang Sun, Yucen Luo, Yihong Gu, Yuhao Zhou: ZhuSuan: A Library for Bayesian Deep Learning (2017) arXiv
- Jonas Rauber, Wieland Brendel, Matthias Bethge: Foolbox v0.8.0: A Python toolbox to benchmark the robustness of machine learning models (2017) arXiv
- Dustin Tran, Alp Kucukelbir, Adji B. Dieng, Maja Rudolph, Dawen Liang, David M. Blei: Edward: A library for probabilistic modeling, inference, and criticism (2016) arXiv
- Patrick Doetsch, Albert Zeyer, Paul Voigtlaender, Ilya Kulikov, Ralf Schlüter, Hermann Ney: RETURNN: The RWTH Extensible Training framework for Universal Recurrent Neural Networks (2016) arXiv