Torch

Torch is a machine learning library written in C++ that runs on most Unix/Linux platforms. It can train MLPs, RBFs, HMMs, Gaussian mixtures, k-means, mixtures of experts, Parzen windows, and KNN models, and it can easily be extended with your own machine learning algorithms. (Source: http://freecode.com/)
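As an illustration of the simplest model class listed above, here is a minimal NumPy sketch of training an MLP (one hidden layer, gradient descent on mean squared error) on the XOR problem. This is not Torch's C++ API, just a plain-Python picture of what "training an MLP" involves; the layer sizes and learning rate are arbitrary choices for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# The XOR problem: a classic task a single-layer perceptron cannot solve.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# One hidden layer (8 tanh units) feeding a single linear output unit.
W1 = rng.normal(size=(2, 8)); b1 = np.zeros(8)
W2 = rng.normal(size=(8, 1)); b2 = np.zeros(1)

lr = 0.05
for _ in range(5000):
    # Forward pass.
    h = np.tanh(X @ W1 + b1)
    p = h @ W2 + b2
    # Backward pass: gradients of the mean squared error.
    dp = 2 * (p - y) / len(X)
    dh = (dp @ W2.T) * (1 - h ** 2)
    # Plain full-batch gradient-descent update.
    W2 -= lr * h.T @ dp; b2 -= lr * dp.sum(axis=0)
    W1 -= lr * X.T @ dh; b1 -= lr * dh.sum(axis=0)

mse = float(np.mean((p - y) ** 2))
print("final MSE:", mse)
```

A library like Torch packages exactly these pieces (layers, criteria, optimizers) behind reusable components, so models such as RBF networks or mixtures of experts can be built and trained with the same machinery.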


References in zbMATH (referenced in 36 articles)

Showing results 1 to 20 of 36, sorted by year (citations).


  1. Vasilis Nikolaidis: The nnlib2 library and nnlib2Rcpp R package for implementing neural networks (2021) not zbMATH
  2. Boukaram, Wajih; Turkiyyah, George; Keyes, David: Randomized GPU algorithms for the construction of hierarchical matrices from matrix-vector operations (2019)
  3. Brown, Noam; Sandholm, Tuomas: Superhuman AI for multiplayer poker (2019)
  4. Edgar Riba, Dmytro Mishkin, Daniel Ponsa, Ethan Rublee, Gary Bradski: Kornia: an Open Source Differentiable Computer Vision Library for PyTorch (2019) arXiv
  5. Higham, Catherine F.; Higham, Desmond J.: Deep learning: an introduction for applied mathematicians (2019)
  6. van den Berg, E.: The Ocean Tensor Package (2019) not zbMATH
  7. Yeo, Kyongmin; Melnyk, Igor: Deep learning algorithm for data-driven simulation of noisy dynamical system (2019)
  8. Aggarwal, Charu C.: Neural networks and deep learning. A textbook (2018)
  9. Baydin, Atılım Güneş; Pearlmutter, Barak A.; Radul, Alexey Andreyevich; Siskind, Jeffrey Mark: Automatic differentiation in machine learning: a survey (2018)
  10. de Bruin, Tim; Kober, Jens; Tuyls, Karl; Babuška, Robert: Experience selection in deep reinforcement learning for control (2018)
  11. Gudivada, Venkat N.; Arbabifard, Kamyar: Open-source libraries, application frameworks, and workflow systems for NLP (2018)
  12. Helmbold, David P.; Long, Philip M.: Surprising properties of dropout in deep networks (2018)
  13. Hubara, Itay; Courbariaux, Matthieu; Soudry, Daniel; El-Yaniv, Ran; Bengio, Yoshua: Quantized neural networks: training neural networks with low precision weights and activations (2018)
  14. Francesco Giannini, Vincenzo Laveglia, Alessandro Rossi, Dario Zanca, Andrea Zugarini: Neural Networks for Beginners. A fast implementation in Matlab, Torch, TensorFlow (2017) arXiv
  15. Han Wang, Linfeng Zhang, Jiequn Han, Weinan E: DeePMD-kit: A deep learning package for many-body potential energy representation and molecular dynamics (2017) arXiv
  16. Hao Dong, Akara Supratak, Luo Mai, Fangde Liu, Axel Oehmichen, Simiao Yu, Yike Guo: TensorLayer: A Versatile Library for Efficient Deep Learning Development (2017) arXiv
  17. Orsini, Francesco; Frasconi, Paolo; De Raedt, Luc: kProbLog: an algebraic Prolog for machine learning (2017)
  18. Richard Wei, Vikram Adve, Lane Schwartz: DLVM: A modern compiler infrastructure for deep learning systems (2017) arXiv
  19. Diamond, Steven; Boyd, Stephen: Matrix-free convex optimization modeling (2016)
  20. Patrick Doetsch, Albert Zeyer, Paul Voigtlaender, Ilya Kulikov, Ralf Schlüter, Hermann Ney: RETURNN: The RWTH Extensible Training framework for Universal Recurrent Neural Networks (2016) arXiv
