Natural Language Inference (NLI) tasks involving temporal inference rema...
Modern language models have the capacity to store and use immense amount...
Pre-trained language models (LMs) are used for knowledge-intensive tasks...
Interpretable entity representations (IERs) are sparse embeddings that a...
Language models (LMs) are typically trained once on a large-scale corpus...
Knowledge-Based Visual Question Answering (KBVQA) is a bi-modal task req...
The growth of cross-lingual pre-trained models has enabled NLP tools to ...
Most benchmark datasets targeting commonsense reasoning focus on everyda...
Pre-trained language models induce dense entity representations that off...
Neural entity typing models typically represent entity types as vectors ...
In standard methodology for natural language processing, entities in tex...
Neural entity linking models are very powerful, but run the risk of over...
Distantly-labeled data can be used to scale up training of statistical m...