Tensor2Tensor, or T2T for short, is a library of deep learning models and datasets designed to make deep learning more accessible and accelerate ML research. T2T is actively used and maintained by researchers and engineers within the Google Brain team and a community of users. We’re eager to collaborate with you too, so feel free to open an issue on GitHub or send along a pull request (see our contribution doc). You can chat with us on Gitter and join the T2T Google Group.
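As a quick illustration of the workflow described above, a typical T2T run generates a dataset and trains a registered model from a single command line. The sketch below follows the library's own MNIST walkthrough and assumes T2T has been installed via pip (with TensorFlow available); the directory paths are placeholders:

```shell
# Install the library (assumes a working TensorFlow installation).
pip install tensor2tensor

# Generate the dataset and train. The --problem, --model, and
# --hparams_set names refer to entries in T2T's registry of
# datasets, models, and hyperparameter sets.
t2t-trainer \
  --generate_data \
  --data_dir=~/t2t_data \
  --output_dir=~/t2t_train/mnist \
  --problem=image_mnist \
  --model=shake_shake \
  --hparams_set=shake_shake_quick \
  --train_steps=1000 \
  --eval_steps=100
```

Swapping the `--problem`, `--model`, and `--hparams_set` flags is how T2T lets the same trainer run over its catalog of datasets and models.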

References in zbMATH (referenced in 31 articles)

Showing results 1 to 20 of 31.
Sorted by year (citations)


  1. Bakhtin, Anton; Deng, Yuntian; Gross, Sam; Ott, Myle; Ranzato, Marc’aurelio; Szlam, Arthur: Residual energy-based models for text (2021)
  2. Bengio, Yoshua; Lodi, Andrea; Prouvost, Antoine: Machine learning for combinatorial optimization: a methodological tour d’horizon (2021)
  3. Chung, Eric; Leung, Wing Tat; Pun, Sai-Mang; Zhang, Zecheng: A multi-stage deep learning based algorithm for multiscale model reduction (2021)
  4. Fan, Angela; Bhosale, Shruti; Schwenk, Holger; Ma, Zhiyi; El-Kishky, Ahmed; Goyal, Siddharth; Baines, Mandeep; Celebi, Onur; Wenzek, Guillaume; Chaudhary, Vishrav; Goyal, Naman; Birch, Tom; Liptchinsky, Vitaliy; Edunov, Sergey; Auli, Michael; Joulin, Armand: Beyond English-centric multilingual machine translation (2021)
  5. Hao, Jie; Zhu, William: Architecture self-attention mechanism: nonlinear optimization for neural architecture search (2021)
  6. Ivek, Tomislav; Vlah, Domagoj: BlackBox: generalizable reconstruction of extremal values from incomplete spatio-temporal data (2021)
  7. Kiermayer, Mark; Weiß, Christian: Grouping of contracts in insurance using neural networks (2021)
  8. Ma, Shaohui; Fildes, Robert: Retail sales forecasting with meta-learning (2021)
  9. Papamakarios, George; Nalisnick, Eric; Rezende, Danilo Jimenez; Mohamed, Shakir; Lakshminarayanan, Balaji: Normalizing flows for probabilistic modeling and inference (2021)
  10. Pérez, Jorge; Barceló, Pablo; Marinkovic, Javier: Attention is Turing-complete (2021)
  11. Yin, Yongjing; Lai, Shaopeng; Song, Linfeng; Zhou, Chulun; Han, Xianpei; Yao, Junfeng; Su, Jinsong: An external knowledge enhanced graph-based neural network for sentence ordering (2021)
  12. Arik, Sercan O.; Pfister, Tomas: ProtoAttend: attention-based prototypical learning (2020)
  13. Bloem-Reddy, Benjamin; Teh, Yee Whye: Probabilistic symmetries and invariant neural networks (2020)
  14. Frady, E. Paxon; Kent, Spencer J.; Olshausen, Bruno A.; Sommer, Friedrich T.: Resonator networks. I: An efficient solution for factoring high-dimensional, distributed representations of data structures (2020)
  15. Geneva, Nicholas; Zabaras, Nicholas: Modeling the dynamics of PDE systems with physics-constrained deep auto-regressive networks (2020)
  16. Kazemi, Seyed Mehran; Goel, Rishab; Jain, Kshitij; Kobyzev, Ivan; Sethi, Akshay; Forsyth, Peter; Poupart, Pascal: Representation learning for dynamic graphs: a survey (2020)
  17. Lang, Xufeng; Sun, Zhengxing: Structure-aware shape correspondence network for 3D shape synthesis (2020)
  18. Sun, Ruo-Yu: Optimization for deep learning: an overview (2020)
  19. Tikhomirov, M. M.; Loukachevitch, N. V.; Dobrov, B. V.: Recognizing named entities in specific domain (2020)
  20. Wang, Chen; Xu, Li-yan; Fan, Jian-sheng: A general deep learning framework for history-dependent response prediction based on UA-Seq2Seq model (2020)
