TensorFlow™ is an open source software library for numerical computation using data flow graphs. Nodes in the graph represent mathematical operations, while the graph edges represent the multidimensional data arrays (tensors) communicated between them. The flexible architecture allows you to deploy computation to one or more CPUs or GPUs in a desktop, server, or mobile device with a single API. TensorFlow was originally developed by researchers and engineers working on the Google Brain Team within Google’s Machine Intelligence research organization for the purposes of conducting machine learning and deep neural networks research, but the system is general enough to be applicable in a wide variety of other domains as well.
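The dataflow-graph model described above can be illustrated with a minimal sketch in plain Python: nodes hold operations, edges connect them to the upstream nodes whose outputs flow in, and evaluation walks the graph. This is a conceptual illustration only; the class and function names here are invented for the sketch and are not part of the TensorFlow API.

```python
# Conceptual sketch of a dataflow graph: nodes represent operations,
# edges carry the values (tensors) communicated between them.
# Names (Node, constant) are illustrative, not the TensorFlow API.

class Node:
    def __init__(self, op, *inputs):
        self.op = op          # callable: the mathematical operation
        self.inputs = inputs  # edges: upstream nodes feeding this one

    def run(self):
        # Evaluate upstream nodes first, then apply this node's operation.
        return self.op(*(n.run() for n in self.inputs))

def constant(value):
    """A source node with no inputs that always yields `value`."""
    return Node(lambda: value)

# Build a small graph computing (2 * 3) + 4, then evaluate it.
prod = Node(lambda x, y: x * y, constant(2.0), constant(3.0))
total = Node(lambda x, y: x + y, prod, constant(4.0))
print(total.run())  # 10.0
```

In TensorFlow the same separation holds at scale: the graph describes the computation, and the runtime decides where each node executes (CPU, GPU, or another device).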

References in zbMATH (referenced in 22 articles)

Showing results 1 to 20 of 22, sorted by year (citations).


  1. Akara Supratak, Hao Dong, Chao Wu, Yike Guo: DeepSleepNet: a Model for Automatic Sleep Stage Scoring based on Raw Single-Channel EEG (2017) arXiv
  2. Bart van Merrienboer, Alexander B. Wiltschko, Dan Moldovan: Tangent: Automatic Differentiation Using Source Code Transformation in Python (2017) arXiv
  3. Francesco Furiani, Giulio Martella, Alberto Paoluzzi: Geometric Computing with Chain Complexes: Design and Features of a Julia Package (2017) arXiv
  4. Francesco Giannini, Vincenzo Laveglia, Alessandro Rossi, Dario Zanca, Andrea Zugarini: Neural Networks for Beginners. A fast implementation in Matlab, Torch, TensorFlow (2017) arXiv
  5. Han Wang, Linfeng Zhang, Jiequn Han, Weinan E: DeePMD-kit: A deep learning package for many-body potential energy representation and molecular dynamics (2017) arXiv
  6. Hao Dong, Akara Supratak, Luo Mai, Fangde Liu, Axel Oehmichen, Simiao Yu, Yike Guo: TensorLayer: A Versatile Library for Efficient Deep Learning Development (2017) arXiv
  7. Haojin Yang, Martin Fritzsche, Christian Bartz, Christoph Meinel: BMXNet: An Open-Source Binary Neural Network Implementation Based on MXNet (2017) arXiv
  8. Jack Baker, Paul Fearnhead, Emily B. Fox, Christopher Nemeth: sgmcmc: An R Package for Stochastic Gradient Markov Chain Monte Carlo (2017) arXiv
  9. Jonas Rauber, Wieland Brendel, Matthias Bethge: Foolbox v0.8.0: A Python toolbox to benchmark the robustness of machine learning models (2017) arXiv
  10. Matthew Dixon, Diego Klabjan, Lan Wei: OSTSC: Over Sampling for Time Series Classification in R (2017) arXiv
  11. Alexander G. de G. Matthews, Mark van der Wilk, Tom Nickson, Keisuke Fujii, Alexis Boukouvalas, Pablo León-Villagrá, Zoubin Ghahramani, James Hensman: GPflow: a Gaussian process library using TensorFlow (2017)
  12. Michael Ringgaard, Rahul Gupta, Fernando C. N. Pereira: SLING: A framework for frame semantic parsing (2017) arXiv
  13. Francesco Orsini, Paolo Frasconi, Luc De Raedt: kProbLog: an algebraic Prolog for machine learning (2017)
  14. Paul Springer, Tong Su, Paolo Bientinesi: HPTT: A High-Performance Tensor Transposition C++ Library (2017) arXiv
  15. Richard Wei, Vikram Adve, Lane Schwartz: DLVM: A modern compiler infrastructure for deep learning systems (2017) arXiv
  16. Ryan R. Curtin, Shikhar Bhardwaj, Marcus Edel, Yannis Mentekidis: A generic and fast C++ optimization framework (2017) arXiv
  17. Halis Sak, İsmail Başoğlu: Efficient randomized quasi-Monte Carlo methods for portfolio market risk (2017)
  18. Wei Wen, Cong Xu, Feng Yan, Chunpeng Wu, Yandan Wang, Yiran Chen, Hai Li: TernGrad: Ternary Gradients to Reduce Communication in Distributed Deep Learning (2017) arXiv
  19. Alexander G. Anderson, Cory P. Berg, Daniel P. Mossing, Bruno A. Olshausen: DeepMovie: Using Optical Flow and Deep Neural Networks to Stylize Movies (2016) arXiv
  20. Steven Diamond, Stephen Boyd: Matrix-free convex optimization modeling (2016)
