• Transformers

  • Referenced in 25 articles [sw30739]
  • ... (RoBERTa, XLM, DistilBert, XLNet, CTRL, ...) for Natural Language Understanding (NLU) and Natural Language Generation...
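  A quick usage sketch (illustrative, not from this entry): with the Hugging Face transformers package installed, a pretrained NLU model runs in a few lines. With no model argument, the pipeline falls back to the library's default English sentiment model.

```python
# Assumes `pip install transformers` plus a backend such as PyTorch,
# and network access to download the default pretrained weights.
from transformers import pipeline

classifier = pipeline("sentiment-analysis")  # library's default sentiment model

result = classifier("Transformers makes NLU models easy to use.")
print(result)  # e.g. [{'label': 'POSITIVE', 'score': 0.99...}]
```
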
  • Mozart

  • Referenced in 20 articles [sw04759]
  • ... assistants, as well as applications in natural language understanding and knowledge representation, in scheduling...
  • GLUE

  • Referenced in 8 articles [sw30755]
  • The General Language Understanding Evaluation (GLUE) benchmark is a collection ... resources for training, evaluating, and analyzing natural language understanding systems. GLUE consists of: A benchmark ... nine sentence- or sentence-pair language understanding tasks built on established existing datasets and selected ... development of general and robust natural language understanding systems...
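  The nine GLUE tasks are distributed via gluebenchmark.com; as an illustration, one common access path (assumed here, not named in the entry) is the Hugging Face datasets loader:

```python
# Assumes `pip install datasets` and network access.
from datasets import load_dataset

# Load one of the nine GLUE tasks, e.g. MRPC (paraphrase detection).
mrpc = load_dataset("glue", "mrpc")

print(mrpc["train"][0])        # one labeled sentence pair
print(mrpc["train"].features)  # task schema, including label names
```
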
  • MultiNet

  • Referenced in 10 articles [sw01583]
  • ... representation system and their relation to natural language understanding and knowledge processing are shown...
  • Grammar Matrix

  • Referenced in 10 articles [sw21659]
  • ... parses and semantic representations necessary for natural language understanding...
  • Flyspeck

  • Referenced in 124 articles [sw10277]
  • ... developments together. The platform supports writing natural language ‘narratives’ that include islands of formal text ... system significantly lowers the threshold for understanding formal development and facilitates collaboration on informal...
  • ELECTRA

  • Referenced in 4 articles [sw37747]
  • ... more compute) on the GLUE natural language understanding benchmark. Our approach also works well...
  • BERT

  • Referenced in 118 articles [sw30756]
  • BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding. We introduce a new language representation model called BERT, which ... Bidirectional Encoder Representations from Transformers. Unlike recent language representation models, BERT is designed ... tasks, such as question answering and language inference, without substantial task-specific architecture modifications. BERT ... state-of-the-art results on eleven natural language processing tasks, including pushing the GLUE...
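  A minimal sketch of the bidirectional masked-token prediction described above, assuming the transformers and torch packages; the checkpoint name and example sentence are illustrative choices, not taken from the entry.

```python
import torch
from transformers import BertTokenizer, BertForMaskedLM

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForMaskedLM.from_pretrained("bert-base-uncased")

# Mask one token; BERT predicts it from context on both sides.
inputs = tokenizer("Paris is the [MASK] of France.", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Locate the masked position and take the highest-scoring vocabulary entry.
mask_pos = (inputs["input_ids"] == tokenizer.mask_token_id).nonzero()[0, 1]
predicted_id = logits[0, mask_pos].argmax(-1).item()
print(tokenizer.decode([predicted_id]))  # expected: "capital"
```
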
  • TinyBERT

  • Referenced in 3 articles [sw32564]
  • TinyBERT: Distilling BERT for Natural Language Understanding. Language model pre-training, such as BERT ... significantly improved the performances of many natural language processing tasks. However, pre-trained language models...
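  To illustrate the distillation idea the entry names, here is a generic prediction-layer distillation loss; TinyBERT's actual recipe additionally distills embeddings, hidden states, and attention matrices layer by layer, which this sketch omits.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """Soft cross-entropy between teacher and student output distributions."""
    t = temperature
    soft_targets = F.softmax(teacher_logits / t, dim=-1)
    log_probs = F.log_softmax(student_logits / t, dim=-1)
    # Scaled by t^2 so gradient magnitudes stay comparable across temperatures.
    return -(soft_targets * log_probs).sum(dim=-1).mean() * (t * t)

# Toy usage: random tensors stand in for teacher/student model outputs.
teacher_logits = torch.randn(8, 2)                      # batch of 8, binary task
student_logits = torch.randn(8, 2, requires_grad=True)
distillation_loss(student_logits, teacher_logits).backward()
```
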
  • XGLUE

  • Referenced in 1 article [sw40136]
  • XGLUE: A New Benchmark Dataset for Cross-lingual Pre-training, Understanding and Generation. In this paper, we introduce ... which is labeled in English for natural language understanding tasks only, XGLUE has two main ... diversified tasks that cover both natural language understanding and generation scenarios; (2) for each task...
  • Multilisp

  • Referenced in 33 articles [sw09420]
  • ... toward symbolic computation. Unlike some parallel programming languages, Multilisp incorporates constructs for causing side effects ... parallel context is mitigated by the nature of the parallelism constructs and by support ... followed, should lead to highly parallel, easily understandable programs...
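  Multilisp's central construct is `future`, which starts evaluating an expression concurrently and immediately returns a placeholder that blocks only when the value is actually touched. A rough Python analogue of that semantics (not Multilisp itself; `fib` is just an illustrative workload, and the thread pool demonstrates the blocking behavior rather than true parallelism):

```python
from concurrent.futures import ThreadPoolExecutor

def fib(n: int) -> int:
    return n if n < 2 else fib(n - 1) + fib(n - 2)

with ThreadPoolExecutor() as pool:
    # Roughly (future (fib 25)): start the computation, continue immediately.
    f = pool.submit(fib, 25)
    print("other work proceeds while fib runs...")
    # Touching the value forces synchronization, as in Multilisp.
    print("fib(25) =", f.result())
```
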
  • StructBERT

  • Referenced in 1 article [sw42711]
  • StructBERT: Incorporating Language Structures into Pre-training for Deep Language Understanding. Recently, the pre-trained language model, BERT (and its robustly ... attracted a lot of attention in natural language understanding (NLU), and achieved state ... tasks, such as sentiment classification, natural language inference, semantic textual similarity and question answering. Inspired ... adapted to different levels of language understanding required by downstream tasks. The StructBERT with structural...
  • BWIBots

  • Referenced in 2 articles [sw29447]
  • ... understand human commands given in natural language, and (d) understand human intention from afar...
  • XLNet

  • Referenced in 20 articles [sw31118]
  • XLNet: Generalized Autoregressive Pretraining for Language Understanding. With the capability of modeling bidirectional contexts, denoising ... performance than pretraining approaches based on autoregressive language modeling. However, relying on corrupting the input ... results on 18 tasks including question answering, natural language inference, sentiment analysis, and document ranking...
  • PIQA

  • Referenced in 1 article [sw42137]
  • ... pose a challenge to today’s natural language understanding systems. While recent pretrained models (such...
  • Zanzibar OpenIVR

  • Referenced in 1 article [sw21455]
  • ... integration of the components, dialog management, natural language understanding. It is designed to work over...
  • Graph4Code

  • Referenced in 1 article [sw33949]
  • ... diverse applications in semantic search and natural language understanding. Graph4Code is a knowledge graph about...
  • DISCOS

  • Referenced in 1 article [sw42598]
  • ... crucial for artificial intelligence systems to understand natural language. Previous commonsense knowledge acquisition approaches typically...
  • ToD-BERT

  • Referenced in 1 article [sw42257]
  • ToD-BERT: Pre-trained Natural Language Understanding for Task-Oriented Dialogue. The underlying difference of linguistic...
  • ColBERT

  • Referenced in 1 article [sw37760]
  • ColBERT: Efficient and Effective Passage Search via Contextualized Late Interaction over BERT. Recent progress in Natural Language Understanding (NLU) is driving fast-paced advances...
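  A minimal sketch of the late interaction that gives ColBERT its name: query and document are encoded into per-token embeddings independently, and relevance is the sum, over query tokens, of the best similarity to any document token (MaxSim). The random tensors below stand in for normalized BERT token embeddings.

```python
import torch
import torch.nn.functional as F

def maxsim_score(q_emb: torch.Tensor, d_emb: torch.Tensor) -> torch.Tensor:
    """q_emb: (q_len, dim), d_emb: (d_len, dim); rows L2-normalized."""
    sim = q_emb @ d_emb.T               # cosine similarities, (q_len, d_len)
    return sim.max(dim=1).values.sum()  # best document token per query token

q = F.normalize(torch.randn(8, 128), dim=-1)    # e.g. 8 query tokens
d = F.normalize(torch.randn(120, 128), dim=-1)  # e.g. 120 document tokens
print(maxsim_score(q, d))
```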