Large Language Models are Strong Zero-Shot Retriever

04/27/2023
by Tao Shen, et al.

In this work, we propose a simple method that applies a large language model (LLM) to large-scale retrieval in zero-shot scenarios. Our method, large language model as Retriever (LameR), is built upon no neural model other than an LLM; it breaks up the brute-force combination of retrievers with LLMs and lifts zero-shot retrieval to be highly competitive on benchmark datasets. Essentially, we propose to augment a query with its potential answers by prompting the LLM with a composition of the query and the query's in-domain candidates. The candidates, regardless of whether they are correct or wrong, are obtained by a vanilla retrieval procedure on the target collection. Such candidates, as part of the prompt, are likely to help the LLM generate more precise answers through pattern imitation or candidate summarization. Even if all the candidates are wrong, the prompt at least makes the LLM aware of in-collection patterns and genres. Moreover, because a self-supervised retriever performs poorly, LLM-based query augmentation becomes much less effective when the retriever bottlenecks the whole pipeline. Therefore, we propose to leverage a non-parametric, lexicon-based method (e.g., BM25) as the retrieval module to capture query-document overlap in a literal fashion. In this way, LameR makes the retrieval procedure transparent to the LLM, thereby circumventing the performance bottleneck.
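The abstract describes a retrieve, prompt, then retrieve-again loop. The following is a minimal sketch of that idea, not the paper's implementation: it assumes the `rank_bm25` package as the lexicon-based retriever and a caller-supplied `llm` callable (any text-in, text-out model) standing in for the LLM; the prompt wording, tokenizer, and parameter names are illustrative.

```python
# Illustrative sketch of the retrieve -> prompt -> retrieve-again idea.
# Assumptions (not from the paper): the `rank_bm25` package for BM25 scoring,
# and a caller-supplied `llm` callable mapping a prompt string to generated text.

from rank_bm25 import BM25Okapi


def tokenize(text):
    # Naive whitespace tokenizer; real preprocessing may differ.
    return text.lower().split()


def lamer_retrieve(query, collection, llm, k_candidates=3, k_final=10):
    """Retrieve documents for `query` from `collection` (a list of strings)."""
    bm25 = BM25Okapi([tokenize(doc) for doc in collection])

    # Step 1: vanilla BM25 retrieval to obtain in-domain candidates,
    # which may be correct or wrong.
    candidates = bm25.get_top_n(tokenize(query), collection, n=k_candidates)

    # Step 2: prompt the LLM with the query plus candidates so it can
    # imitate in-collection patterns and write a likely answer.
    prompt = (
        f"Question: {query}\n"
        + "".join(f"Possible passage: {c}\n" for c in candidates)
        + "Write a short passage that answers the question:"
    )
    generated = llm(prompt)

    # Step 3: augment the query with the generated answer and run BM25 again,
    # keeping the retrieval procedure transparent to the LLM.
    augmented = query + " " + generated
    return bm25.get_top_n(tokenize(augmented), collection, n=k_final)
```

A fuller version might generate several answers and repeat the original query to balance its lexical weight against the generated text; the single generation above keeps the sketch short.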


Related research

10/01/2022 · Zemi: Learning Zero-Shot Semi-Parametric Language Models from Multiple Tasks
Although large language models have achieved impressive zero-shot abilit...

10/19/2022 · TabLLM: Few-shot Classification of Tabular Data with Large Language Models
We study the application of large language models to zero-shot and few-s...

08/16/2023 · Pre-training with Large Language Model-based Document Expansion for Dense Passage Retrieval
In this paper, we systematically study the potential of pre-training wit...

05/27/2022 · Nearest Neighbor Zero-Shot Inference
We introduce kNN-Prompt, a simple and effective technique to use k-neare...

01/25/2022 · Out-of-Domain Semantics to the Rescue! Zero-Shot Hybrid Retrieval Models
The pre-trained language model (e.g., BERT) based deep retrieval models ac...

12/20/2022 · Precise Zero-Shot Dense Retrieval without Relevance Labels
While dense retrieval has been shown effective and efficient across task...

08/31/2022 · LexMAE: Lexicon-Bottlenecked Pretraining for Large-Scale Retrieval
In large-scale retrieval, the lexicon-weighting paradigm, learning weigh...
