Momentum Decoding: Open-ended Text Generation As Graph Exploration

by Tian Lan, et al.

Open-ended text generation with autoregressive language models (LMs) is one of the core tasks in natural language processing. However, maximization-based decoding methods (e.g., greedy or beam search) often lead to the degeneration problem: the generated text is unnatural and contains undesirable repetitions. Existing solutions either introduce randomness, which is prone to incoherence, or rely on a look-ahead mechanism that incurs extra computational overhead. In this study, we formulate open-ended text generation from a new perspective, viewing it as an exploration process within a directed graph. Under this view, degeneration corresponds to circular loops within the graph. Based on this formulation, we propose a novel decoding method, momentum decoding, which encourages the LM to greedily explore new nodes outside the current graph, while still allowing it to return to existing nodes with a momentum downgraded by a pre-defined resistance function. We extensively test our approach on three benchmarks from different domains through automatic and human evaluations. The results show that momentum decoding performs comparably with the current state of the art while enjoying notably improved inference speed and computational FLOPs. Furthermore, we conduct a detailed analysis to reveal the merits and inner workings of our approach. Our code and other related resources are publicly available.
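The abstract's core idea can be sketched as a single decoding step: a greedy choice over the vocabulary, where tokens that would revisit an existing node in the generation graph have their scores reduced by a resistance penalty. This is a minimal, hypothetical illustration, not the paper's exact algorithm; the `resistance` function's form and its coefficient are assumptions for the sake of the toy example.

```python
def resistance(visit_count):
    # Assumed resistance function: the penalty grows with how many
    # times the node (token) has already been visited. The linear
    # form and the 2.0 coefficient are illustrative, not from the paper.
    return 2.0 * visit_count

def momentum_step(logits, generated):
    """Pick the next token id given raw per-token logits (a list of
    floats) and the list of token ids generated so far.

    New tokens (nodes outside the current graph) keep their raw score;
    previously generated tokens are revisited only with a score
    downgraded by the resistance penalty."""
    best_id, best_score = None, float("-inf")
    for tok_id, logit in enumerate(logits):
        score = logit
        if tok_id in generated:  # returning to an existing node
            score -= resistance(generated.count(tok_id))
        if score > best_score:
            best_id, best_score = tok_id, score
    return best_id

# Toy example: token 2 has the highest raw logit (5.0) but has already
# been generated twice, so its penalized score (5.0 - 4.0 = 1.0) loses
# to the unseen token 1 (score 3.0).
logits = [0.1, 3.0, 5.0]
generated = [2, 0, 2]
print(momentum_step(logits, generated))  # prints 1
```

Without any history the step reduces to plain greedy decoding, which matches the intuition that momentum decoding only deviates from greedy search when a loop is about to form.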


