Modern language models capture a large body of factual knowledge. However...
Text-to-image diffusion models have demonstrated an unparalleled ability...
A prominent weakness of modern language models (LMs) is their tendency to...
Transformer-based language models (LMs) are known to capture factual knowledge...
Transformer-based language models (LMs) create hidden representations of...
Language models are trained on large volumes of text, and as a result th...
To produce accurate predictions, language models (LMs) must balance between...
Understanding Transformer-based models has attracted significant attention...
In recent years, progress in NLU has been driven by benchmarks. These benchmarks...
A prominent challenge for modern language understanding systems is the a...
The opaque nature and unexplained behavior of transformer-based language models...
Transformer-based language models (LMs) are at the core of modern NLP, but...
NLP benchmarks have largely focused on short texts, such as sentences and...
Recent efforts to create challenge benchmarks that test the abilities of...
The primary paradigm for multi-task training in natural language processing...
A key limitation in current datasets for multi-hop reasoning is that the...
Feed-forward layers constitute two-thirds of a transformer model's parameters...
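As a rough sanity check on the two-thirds figure, the sketch below (my own illustration, not from the paper, assuming a standard block where the feed-forward inner dimension is 4x the hidden size and counting only the attention and feed-forward weight matrices) reproduces the ratio:

    # Back-of-the-envelope parameter count for one transformer block.
    # Assumptions (mine, not from the paper): d_ff = 4 * d_model;
    # embeddings, biases, and layer norms are ignored.
    d_model = 768                          # hidden size, e.g. BERT-base
    d_ff = 4 * d_model                     # feed-forward inner dimension

    attn_params = 4 * d_model * d_model    # W_Q, W_K, W_V, W_O
    ffn_params = 2 * d_model * d_ff        # W_in (d_model x d_ff) + W_out (d_ff x d_model)

    print(ffn_params / (attn_params + ffn_params))  # 0.666..., i.e. two-thirds

Under these assumptions the feed-forward matrices contribute 8 * d_model**2 parameters against 4 * d_model**2 for attention, which is where the two-thirds fraction comes from.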
Large pre-trained language models (LMs) are known to encode substantial...
Understanding natural language questions entails the ability to break down...
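To make the decomposition idea concrete, here is a small hypothetical illustration (the question and step wording are my own, not drawn from any dataset): a question represented as an ordered list of atomic steps, where #k refers to the result of step k.

    # Hypothetical example of a question broken into ordered reasoning steps.
    question = "What is the longest river in the country with the largest population?"
    steps = [
        "return the country with the largest population",  # step 1
        "return rivers of #1",                             # step 2
        "return the longest of #2",                        # step 3
    ]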
Crowdsourcing has been the prevalent paradigm for creating natural language...
Sentence fusion is the task of joining several independent sentences into...
Training agents to communicate with one another given task-based supervision...
Reading comprehension models are based on recurrent neural networks that...
Semantic parsing shines at analyzing complex natural language that involves...