Tensor2Tensor, or T2T for short, is a library of deep learning models and datasets designed to make deep learning more accessible and accelerate ML research. T2T is actively used and maintained by researchers and engineers within the Google Brain team and a community of users. We’re eager to collaborate with you too, so feel free to open an issue on GitHub or send along a pull request (see our contribution doc). You can chat with us on Gitter and join the T2T Google Group.
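
The typical entry points are the library's registries of datasets ("problems") and models. Below is a minimal sketch, assuming tensor2tensor has been installed via pip and using its documented Python API; the problem name and directory paths are illustrative placeholders:

    # Browse T2T's registries of datasets ("problems") and models,
    # then generate training data for one registered problem.
    from tensor2tensor import problems
    from tensor2tensor.utils import registry

    print(problems.available())    # names of all registered problems
    print(registry.list_models())  # names of all registered models

    # "translate_ende_wmt32k" is one of the registered translation
    # problems; the directories below are placeholder paths.
    ende = problems.problem("translate_ende_wmt32k")
    ende.generate_data(data_dir="/tmp/t2t_data", tmp_dir="/tmp/t2t_tmp")

Training is then typically driven from the command line via the t2t-trainer tool, with matching --problem, --model, and --hparams_set flags.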

References in zbMATH (referenced in 20 articles)

  1. Hao, Jie; Zhu, William: Architecture self-attention mechanism: nonlinear optimization for neural architecture search (2021)
  2. Yin, Yongjing; Lai, Shaopeng; Song, Linfeng; Zhou, Chulun; Han, Xianpei; Yao, Junfeng; Su, Jinsong: An external knowledge enhanced graph-based neural network for sentence ordering (2021)
  3. Arik, Sercan O.; Pfister, Tomas: ProtoAttend: attention-based prototypical learning (2020)
  4. Bloem-Reddy, Benjamin; Teh, Yee Whye: Probabilistic symmetries and invariant neural networks (2020)
  5. Frady, E. Paxon; Kent, Spencer J.; Olshausen, Bruno A.; Sommer, Friedrich T.: Resonator networks. I: An efficient solution for factoring high-dimensional, distributed representations of data structures (2020)
  6. Geneva, Nicholas; Zabaras, Nicholas: Modeling the dynamics of PDE systems with physics-constrained deep auto-regressive networks (2020)
  7. Kazemi, Seyed Mehran; Goel, Rishab; Jain, Kshitij; Kobyzev, Ivan; Sethi, Akshay; Forsyth, Peter; Poupart, Pascal: Representation learning for dynamic graphs: a survey (2020)
  8. Lang, Xufeng; Sun, Zhengxing: Structure-aware shape correspondence network for 3D shape synthesis (2020)
  9. Sun, Ruo-Yu: Optimization for deep learning: an overview (2020)
  10. Tikhomirov, M. M.; Loukachevitch, N. V.; Dobrov, B. V.: Recognizing named entities in specific domain (2020)
  11. Wang, Shirui; Zhou, Wenan; Jiang, Chao: A survey of word embeddings based on deep learning (2020)
  12. Ye, Han-Jia; Sheng, Xiang-Rong; Zhan, De-Chuan: Few-shot learning with adaptively initialized task optimizer: a practical meta-learning approach (2020)
  13. Chen, Shun; Ge, Lei: Exploring the attention mechanism in LSTM-based Hong Kong stock price movement prediction (2019)
  14. Ismail Fawaz, Hassan; Forestier, Germain; Weber, Jonathan; Idoumghar, Lhassane; Muller, Pierre-Alain: Deep learning for time series classification: a review (2019)
  15. Su, Jinsong; Zhang, Xiangwen; Lin, Qian; Qin, Yue; Yao, Junfeng; Liu, Yang: Exploiting reverse target-side contexts for neural machine translation via asynchronous bidirectional decoding (2019)
  16. Wolf, Thomas; Debut, Lysandre; Sanh, Victor; Chaumond, Julien; Delangue, Clement; Moi, Anthony; Cistac, Pierric; Rault, Tim; Louf, Rémi; Funtowicz, Morgan; Brew, Jamie: HuggingFace's Transformers: state-of-the-art natural language processing (2019) arXiv
  17. Zeyer, Albert; Alkhouli, Tamer; Ney, Hermann: RETURNN as a generic flexible neural toolkit with application to translation and speech recognition (2018) arXiv
  18. Kuchaiev, Oleksii; Ginsburg, Boris; Gitman, Igor; Lavrukhin, Vitaly; Case, Carl; Micikevicius, Paulius: OpenSeq2Seq: extensible toolkit for distributed and mixed precision training of sequence-to-sequence models (2018) arXiv
  19. Wang, Xiaolin; Utiyama, Masao; Sumita, Eiichiro: CytonMT: an efficient neural machine translation open-source toolkit implemented in C++ (2018) arXiv
  20. Hu, Zhiting; Shi, Haoran; Yang, Zichao; Tan, Bowen; Zhao, Tiancheng; He, Junxian; Wang, Wentao; Yu, Xingjiang; Qin, Lianhui; Wang, Di; Ma, Xuezhe; Liu, Hector; Liang, Xiaodan; Zhu, Wanrong; Sachan, Devendra Singh; Xing, Eric P.: Texar: a modularized, versatile, and extensible toolkit for text generation (2018) arXiv