Torch

Torch is a machine learning library written in C++ that runs on most Unix/Linux platforms. It can be used to train MLPs, RBFs, HMMs, Gaussian mixtures, k-means models, mixtures of experts, Parzen windows, and k-NN classifiers, and it can easily be extended with your own machine learning algorithms. (Source: http://freecode.com/)
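As a rough illustration of the kind of model Torch is built to train, the following self-contained C++ sketch fits a one-hidden-layer MLP to the XOR problem with squared error and per-pattern gradient descent. It deliberately does not use Torch's own classes or API; every name in it is ad hoc.

// Minimal MLP sketch (NOT Torch's API; all names here are ad hoc):
// a one-hidden-layer network trained on XOR with squared error and
// per-pattern gradient descent.
#include <array>
#include <cmath>
#include <cstdio>
#include <random>

int main() {
    constexpr int kIn = 2, kHidden = 8;
    // XOR training set: inputs X and targets Y.
    const std::array<std::array<double, kIn>, 4> X = {{{0, 0}, {0, 1}, {1, 0}, {1, 1}}};
    const std::array<double, 4> Y = {0, 1, 1, 0};

    // Small random initial weights for the hidden layer (W1, b1) and output unit (w2, b2).
    std::mt19937 rng(42);
    std::uniform_real_distribution<double> init(-0.5, 0.5);
    std::array<std::array<double, kIn>, kHidden> W1;
    std::array<double, kHidden> b1{}, w2;
    double b2 = 0.0;
    for (auto& row : W1) for (double& w : row) w = init(rng);
    for (double& w : w2) w = init(rng);

    const double lr = 0.5;  // learning rate
    for (int epoch = 0; epoch < 5000; ++epoch) {
        for (int n = 0; n < 4; ++n) {
            // Forward pass: tanh hidden layer, sigmoid output.
            std::array<double, kHidden> h;
            for (int j = 0; j < kHidden; ++j) {
                double s = b1[j];
                for (int i = 0; i < kIn; ++i) s += W1[j][i] * X[n][i];
                h[j] = std::tanh(s);
            }
            double o = b2;
            for (int j = 0; j < kHidden; ++j) o += w2[j] * h[j];
            double y = 1.0 / (1.0 + std::exp(-o));

            // Backward pass for the squared error 0.5 * (y - t)^2, then gradient step.
            double dy = (y - Y[n]) * y * (1.0 - y);
            for (int j = 0; j < kHidden; ++j) {
                double dh = dy * w2[j] * (1.0 - h[j] * h[j]);
                w2[j] -= lr * dy * h[j];
                for (int i = 0; i < kIn; ++i) W1[j][i] -= lr * dh * X[n][i];
                b1[j] -= lr * dh;
            }
            b2 -= lr * dy;
        }
    }

    // Print the learned mapping; outputs should be close to 0, 1, 1, 0.
    for (int n = 0; n < 4; ++n) {
        std::array<double, kHidden> h;
        for (int j = 0; j < kHidden; ++j) {
            double s = b1[j];
            for (int i = 0; i < kIn; ++i) s += W1[j][i] * X[n][i];
            h[j] = std::tanh(s);
        }
        double o = b2;
        for (int j = 0; j < kHidden; ++j) o += w2[j] * h[j];
        std::printf("%g XOR %g -> %.3f\n", X[n][0], X[n][1], 1.0 / (1.0 + std::exp(-o)));
    }
    return 0;
}

In Torch itself, the same task would be expressed through the library's trainer and criterion classes rather than hand-written loops; the point of the sketch is only to show the class of gradient-trained models the library targets.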


References in zbMATH (referenced in 34 articles)

Showing results 1 to 20 of 34, sorted by year (citations).


  1. Boukaram, Wajih; Turkiyyah, George; Keyes, David: Randomized GPU algorithms for the construction of hierarchical matrices from matrix-vector operations (2019)
  2. Brown, Noam; Sandholm, Tuomas: Superhuman AI for multiplayer poker (2019)
  3. Riba, Edgar; Mishkin, Dmytro; Ponsa, Daniel; Rublee, Ethan; Bradski, Gary: Kornia: an Open Source Differentiable Computer Vision Library for PyTorch (2019) arXiv
  4. Higham, Catherine F.; Higham, Desmond J.: Deep learning: an introduction for applied mathematicians (2019)
  5. van den Berg, E.: The Ocean Tensor Package (2019) not zbMATH
  6. Yeo, Kyongmin; Melnyk, Igor: Deep learning algorithm for data-driven simulation of noisy dynamical system (2019)
  7. Aggarwal, Charu C.: Neural networks and deep learning. A textbook (2018)
  8. Baydin, Atılım Güneş; Pearlmutter, Barak A.; Radul, Alexey Andreyevich; Siskind, Jeffrey Mark: Automatic differentiation in machine learning: a survey (2018)
  9. de Bruin, Tim; Kober, Jens; Tuyls, Karl; Babuška, Robert: Experience selection in deep reinforcement learning for control (2018)
  10. Gudivada, Venkat N.; Arbabifard, Kamyar: Open-source libraries, application frameworks, and workflow systems for NLP (2018)
  11. Helmbold, David P.; Long, Philip M.: Surprising properties of dropout in deep networks (2018)
  12. Hubara, Itay; Courbariaux, Matthieu; Soudry, Daniel; El-Yaniv, Ran; Bengio, Yoshua: Quantized neural networks: training neural networks with low precision weights and activations (2018)
  13. Giannini, Francesco; Laveglia, Vincenzo; Rossi, Alessandro; Zanca, Dario; Zugarini, Andrea: Neural Networks for Beginners. A fast implementation in Matlab, Torch, TensorFlow (2017) arXiv
  14. Wang, Han; Zhang, Linfeng; Han, Jiequn; E, Weinan: DeePMD-kit: A deep learning package for many-body potential energy representation and molecular dynamics (2017) arXiv
  15. Dong, Hao; Supratak, Akara; Mai, Luo; Liu, Fangde; Oehmichen, Axel; Yu, Simiao; Guo, Yike: TensorLayer: A Versatile Library for Efficient Deep Learning Development (2017) arXiv
  16. Orsini, Francesco; Frasconi, Paolo; De Raedt, Luc: kProbLog: an algebraic prolog for machine learning (2017)
  17. Wei, Richard; Adve, Vikram; Schwartz, Lane: DLVM: A modern compiler infrastructure for deep learning systems (2017) arXiv
  18. Diamond, Steven; Boyd, Stephen: Matrix-free convex optimization modeling (2016)
  19. Doetsch, Patrick; Zeyer, Albert; Voigtlaender, Paul; Kulikov, Ilya; Schlüter, Ralf; Ney, Hermann: RETURNN: The RWTH Extensible Training framework for Universal Recurrent Neural Networks (2016) arXiv
  20. Singh, Ritambhara; Lanchantin, Jack; Robins, Gabriel; Qi, Yanjun: DeepChrome: Deep-learning for predicting gene expression from histone modifications (2016) arXiv
