NeurIPS 2020 EfficientQA Competition: Systems, Analyses and Lessons Learned

01/01/2021 · by Sewon Min, et al.
We review the EfficientQA competition from NeurIPS 2020. The competition focused on open-domain question answering (QA), where systems take natural language questions as input and return natural language answers. The aim of the competition was to build systems that can predict correct answers while also satisfying strict on-disk memory budgets. These memory budgets were designed to encourage contestants to explore the trade-off between storing large, redundant retrieval corpora and the parameters of large learned models. In this report, we describe the motivation and organization of the competition, review the best submissions, and analyze system predictions to inform a discussion of evaluation for open-domain QA.

