In many use-cases, information is stored in text but not available in st...
Structured and grounded representation of text is typically formalized b...
There has been a lot of interest in understanding what information is ca...
Generative approaches have recently been shown to be effective for both ...
The factual knowledge acquired during pretraining and stored in the para...
We present mGENRE, a sequence-to-sequence system for the Multilingual En...
We review the EfficientQA competition from NeurIPS 2020. The competition...
Recently, retrieval systems based on dense representations have led to i...
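As background on the dense retrieval mentioned above (a generic dual-encoder formulation, not necessarily the specific systems discussed, and with encoder names $E_Q$, $E_P$ introduced here only for illustration): a query $q$ and a passage $p$ are embedded separately and scored by inner product,
$ s(q, p) = E_Q(q)^\top E_P(p), $
with the top-$k$ passages returned by maximum inner product search over the passage index.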
Entities are at the center of how we represent and aggregate knowledge. ...
Graph neural networks (GNNs) have become a popular approach to integrati...
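For background on the GNNs mentioned above (the generic message-passing formulation, not a claim about this particular paper's architecture): a typical layer updates the representation of node $v$ as
$ h_v^{(\ell+1)} = \phi\Big( h_v^{(\ell)}, \textstyle\bigoplus_{u \in \mathcal{N}(v)} \psi\big(h_u^{(\ell)}, h_v^{(\ell)}\big) \Big), $
where $\mathcal{N}(v)$ is the neighbourhood of $v$, $\bigoplus$ is a permutation-invariant aggregator such as sum or mean, and $\phi$, $\psi$ are learned functions.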
Challenging problems such as open-domain question answering, fact checki...
There is a growing interest in probabilistic models defined in hyper-sph...
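For background on hyperspherical models (the abstract does not say which distribution the paper actually uses): the most common density on the unit hypersphere $S^{d-1}$ is the von Mises-Fisher distribution,
$ p(x \mid \mu, \kappa) = C_d(\kappa)\, \exp\big(\kappa\, \mu^\top x\big), \qquad x, \mu \in S^{d-1},\ \kappa \ge 0, $
whose normalising constant $C_d(\kappa)$ involves a modified Bessel function.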
Attribution methods assess the contribution of inputs (e.g., words) to t...
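As one standard example of such an attribution method (given for illustration only; it is not necessarily the method studied in this paper): gradient $\times$ input scores the $i$-th input feature of a model $f$ as $ a_i = x_i \, \partial f(x) / \partial x_i $.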
Normalising flows (NFs) map two density functions via a differentiable b...
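For reference, the change-of-variables identity underlying normalising flows (standard background, not specific to this paper): if $z \sim p_Z$ and $x = f(z)$ for an invertible, differentiable $f$, then
$ p_X(x) = p_Z\big(f^{-1}(x)\big)\, \big|\det J_{f^{-1}}(x)\big|, $
where $J_{f^{-1}}$ is the Jacobian of the inverse map.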
Most research in reading comprehension has focused on answering question...
The manifold hypothesis states that many kinds of high-dimensional data ...
Deep generative models for graph-structured data offer a new angle on th...
The Variational Auto-Encoder (VAE) is one of the most used unsupervised ...
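For reference, the standard VAE training objective is the evidence lower bound (given as general background rather than as this paper's contribution),
$ \log p_\theta(x) \;\ge\; \mathbb{E}_{q_\phi(z \mid x)}\big[\log p_\theta(x \mid z)\big] - \mathrm{KL}\big(q_\phi(z \mid x)\,\|\,p(z)\big), $
maximised jointly over the decoder parameters $\theta$ and the encoder parameters $\phi$.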