Tensor2Tensor, or T2T for short, is a library of deep learning models and datasets designed to make deep learning more accessible and accelerate ML research. T2T is actively used and maintained by researchers and engineers within the Google Brain team and a community of users. We’re eager to collaborate with you too, so feel free to open an issue on GitHub or send along a pull request (see our contribution doc). You can chat with us on Gitter and join the T2T Google Group.

References in zbMATH (referenced in 77 articles)

Showing results 1 to 20 of 77, sorted by year (citations).


  1. Cropper, Andrew; Dumančić, Sebastijan; Evans, Richard; Muggleton, Stephen H.: Inductive logic programming at 30 (2022)
  2. Dognin, Pierre; Melnyk, Igor; Mroueh, Youssef; Padhi, Inkit; Rigotti, Mattia; Ross, Jarret; Schiff, Yair; Young, Richard A.; Belgodere, Brian: Image captioning as an assistive technology: Lessons learned from VizWiz 2020 challenge (2022)
  3. Jagtap, N. V.; Mudunuru, M. K.; Nakshatrala, K. B.: A deep learning modeling framework to capture mixing patterns in reactive-transport systems (2022)
  4. Li, Chaofan; Ma, Kai: Entity recognition of Chinese medical text based on multi-head self-attention combined with BILSTM-CRF (2022)
  5. Liu, Ruibo; Jia, Chenyan; Wei, Jason; Xu, Guangxuan; Vosoughi, Soroush: Quantifying and alleviating political bias in language models (2022)
  6. Li, Zuchao; Zhou, Junru; Zhao, Hai; Zhang, Zhisong; Li, Haonan; Ju, Yuqi: Neural character-level syntactic parsing for Chinese (2022)
  7. Loureiro, Daniel; Mário Jorge, Alípio; Camacho-Collados, Jose: LMMS reloaded: transformer-based sense embeddings for disambiguation and beyond (2022)
  8. Lu, Yaojie; Lin, Hongyu; Tang, Jialong; Han, Xianpei; Sun, Le: End-to-end neural event coreference resolution (2022)
  9. Ras, Gabrielle; Xie, Ning; van Gerven, Marcel; Doran, Derek: Explainable deep learning: a field guide for the uninitiated (2022)
  10. Salcedo-Sanz, S.; Casillas-Pérez, D.; Del Ser, J.; Casanova-Mateo, C.; Cuadra, L.; Piles, M.; Camps-Valls, G.: Persistence in complex systems (2022)
  11. Škrlj, Blaž; Džeroski, Sašo; Lavrač, Nada; Petković, Matej: ReliefE: feature ranking in high-dimensional spaces via manifold embeddings (2022)
  12. Abbasimehr, Hossein; Paki, Reza: Prediction of COVID-19 confirmed cases combining deep learning methods and Bayesian optimization (2021)
  13. Adewoyin, Rilwan A.; Dueben, Peter; Watson, Peter; He, Yulan; Dutta, Ritabrata: TRU-NET: a deep learning approach to high resolution prediction of rainfall (2021)
  14. Bakhtin, Anton; Deng, Yuntian; Gross, Sam; Ott, Myle; Ranzato, Marc’Aurelio; Szlam, Arthur: Residual energy-based models for text (2021)
  15. Bengio, Yoshua; Lodi, Andrea; Prouvost, Antoine: Machine learning for combinatorial optimization: a methodological tour d’horizon (2021)
  16. Chen, Chuangtao; He, Zhimin; Huang, Zhiming; Situ, Haozhen: Reconstructing a quantum state with a variational autoencoder (2021)
  17. Chen, Jiaoyan; Hu, Pan; Jimenez-Ruiz, Ernesto; Holter, Ole Magnus; Antonyrajah, Denvar; Horrocks, Ian: OWL2Vec*: embedding of OWL ontologies (2021)
  18. Chung, Eric; Leung, Wing Tat; Pun, Sai-Mang; Zhang, Zecheng: A multi-stage deep learning based algorithm for multiscale model reduction (2021)
  19. Ding, Man; Han, Congying; Guo, Tiande: High generalization performance structured self-attention model for knapsack problem (2021)
  20. Evans, Richard; Bošnjak, Matko; Buesing, Lars; Ellis, Kevin; Pfau, David; Kohli, Pushmeet; Sergot, Marek: Making sense of raw input (2021)
