Deep Communicating Agents for Abstractive Summarization

03/27/2018
by Asli Celikyilmaz, et al.

We present deep communicating agents in an encoder-decoder architecture to address the challenges of representing a long document for abstractive summarization. With deep communicating agents, the task of encoding a long text is divided across multiple collaborating agents, each in charge of a subsection of the input text. These encoders are connected to a single decoder, trained end-to-end using reinforcement learning to generate a focused and coherent summary. Empirical results demonstrate that multiple communicating encoders lead to a higher quality summary compared to several strong baselines, including those based on a single encoder or multiple non-communicating encoders.
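The division of labor described in the abstract can be sketched in a few lines: the document is split into contiguous chunks, one per agent; each agent encodes its chunk; the agents exchange messages; and a single decoder consumes the combined result. The sketch below is a toy illustration only, assuming bag-of-words count vectors in place of the paper's recurrent encoders and a mean-pooled message in place of its learned communication; all function names are invented for this example.

```python
def split_among_agents(words, n_agents):
    """Divide a long document into contiguous chunks, one per agent."""
    size = -(-len(words) // n_agents)  # ceiling division
    return [words[i * size:(i + 1) * size] for i in range(n_agents)]

def encode(chunk, vocab):
    """Toy stand-in for an agent's encoder: bag-of-words counts."""
    return [chunk.count(w) for w in vocab]

def communicate(encodings):
    """One communication round: each agent adds the mean of the other
    agents' encodings to its own (a crude stand-in for the message
    passing between the paper's encoder agents)."""
    n = len(encodings)
    out = []
    for i, enc in enumerate(encodings):
        others = [e for j, e in enumerate(encodings) if j != i]
        mean = [sum(vals) / (n - 1) for vals in zip(*others)]
        out.append([a + b for a, b in zip(enc, mean)])
    return out

def decoder_context(encodings):
    """Toy single decoder input: element-wise sum over agent outputs
    (the real model attends over agent states instead)."""
    return [sum(vals) for vals in zip(*encodings)]

# Usage: two agents share a six-word "document".
words = ["a", "b", "a", "c", "b", "b"]
vocab = ["a", "b", "c"]
chunks = split_among_agents(words, 2)          # [['a','b','a'], ['c','b','b']]
encs = [encode(c, vocab) for c in chunks]      # [[2,1,0], [0,2,1]]
ctx = decoder_context(communicate(encs))       # [4.0, 6.0, 2.0]
```

After communication each agent's representation reflects the whole document, not just its own chunk, which is the property the paper argues lets a single decoder produce a focused, coherent summary of long input.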

