Torch is a machine learning library written in C++ that runs on most Unix/Linux platforms. It can be used to train MLPs, RBF networks, HMMs, Gaussian mixtures, K-means, mixtures of experts, Parzen windows, and k-NN classifiers, and it can easily be extended with your own machine learning algorithms.

References in zbMATH (referenced in 28 articles)

Showing results 1 to 20 of 28, sorted by year (citations).


  1. Aggarwal, Charu C.: Neural networks and deep learning. A textbook (2018)
  2. Baydin, Atılım Güneş; Pearlmutter, Barak A.; Radul, Alexey Andreyevich; Siskind, Jeffrey Mark: Automatic differentiation in machine learning: a survey (2018)
  3. de Bruin, Tim; Kober, Jens; Tuyls, Karl; Babuška, Robert: Experience selection in deep reinforcement learning for control (2018)
  4. Gudivada, Venkat N.; Arbabifard, Kamyar: Open-source libraries, application frameworks, and workflow systems for NLP (2018)
  5. Helmbold, David P.; Long, Philip M.: Surprising properties of dropout in deep networks (2018)
  6. Hubara, Itay; Courbariaux, Matthieu; Soudry, Daniel; El-Yaniv, Ran; Bengio, Yoshua: Quantized neural networks: training neural networks with low precision weights and activations (2018)
  7. Giannini, Francesco; Laveglia, Vincenzo; Rossi, Alessandro; Zanca, Dario; Zugarini, Andrea: Neural networks for beginners. A fast implementation in Matlab, Torch, TensorFlow (2017) arXiv
  8. Wang, Han; Zhang, Linfeng; Han, Jiequn; E, Weinan: DeePMD-kit: a deep learning package for many-body potential energy representation and molecular dynamics (2017) arXiv
  9. Dong, Hao; Supratak, Akara; Mai, Luo; Liu, Fangde; Oehmichen, Axel; Yu, Simiao; Guo, Yike: TensorLayer: a versatile library for efficient deep learning development (2017) arXiv
  10. Orsini, Francesco; Frasconi, Paolo; De Raedt, Luc: kProbLog: an algebraic prolog for machine learning (2017)
  11. Wei, Richard; Adve, Vikram; Schwartz, Lane: DLVM: a modern compiler infrastructure for deep learning systems (2017) arXiv
  12. Diamond, Steven; Boyd, Stephen: Matrix-free convex optimization modeling (2016)
  13. Doetsch, Patrick; Zeyer, Albert; Voigtlaender, Paul; Kulikov, Ilya; Schlüter, Ralf; Ney, Hermann: RETURNN: the RWTH extensible training framework for universal recurrent neural networks (2016) arXiv
  14. Singh, Ritambhara; Lanchantin, Jack; Robins, Gabriel; Qi, Yanjun: DeepChrome: deep learning for predicting gene expression from histone modifications (2016) arXiv
  15. Žbontar, Jure; LeCun, Yann: Stereo matching by training a convolutional neural network to compare image patches (2016)
  16. Doermann, David (ed.); Tombre, Karl (ed.): Handbook of document image processing and recognition (2014)
  17. Mesnil, Grégoire; Bordes, Antoine; Weston, Jason; Chechik, Gal; Bengio, Yoshua: Learning semantic representations of objects and their parts (2014)
  18. Hazrati Fard, Seyed Mehdi; Hamzeh, Ali; Hashemi, Sattar: Using reinforcement learning to find an optimal set of features (2013)
  19. Kovacs, Tim; Egginton, Robert: On the analysis and design of software for reinforcement learning, with a survey of existing systems (2011) ioport
  20. Jin, Xiao-Bo; Liu, Cheng-Lin; Hou, Xinwen: Regularized margin-based conditional log-likelihood loss for prototype learning (2010)
