Query Focused Abstractive Summarization: Incorporating Query Relevance, Multi-Document Coverage, and Summary Length Constraints into seq2seq Models

01/23/2018
by Tal Baumel, et al.

Query Focused Summarization (QFS) has been addressed mostly with extractive methods. Such methods, however, produce text that suffers from low coherence. We investigate how abstractive methods can be applied to QFS to overcome such limitations. Recent developments in neural attention-based sequence-to-sequence models have led to state-of-the-art results on the task of abstractive generic single-document summarization. Such models are trained end to end on large amounts of training data. We address three aspects to make abstractive summarization applicable to QFS: (a) since there is no training data, we incorporate query relevance into a pre-trained abstractive model; (b) since existing abstractive models are trained in a single-document setting, we design an iterated method to embed abstractive models within the multi-document requirement of QFS; (c) the abstractive models we adapt are trained to generate text of a specific length (about 100 words), while we aim to generate output of a different size (about 250 words); we design a way to adapt the target size of the generated summaries to a given size ratio. We compare our method (Relevance Sensitive Attention for QFS) with extractive baselines and with various ways to combine abstractive models on the DUC QFS datasets, and demonstrate solid improvements in ROUGE performance.
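Although the abstract gives no implementation details, two of the three adaptations lend themselves to a rough illustration. The NumPy sketch below shows one plausible way to (a) fold query relevance into a decoder's attention distribution and (c) derive a length-ratio adjustment; the iterated multi-document procedure of (b) is omitted. The function names and the cosine-similarity relevance scorer are assumptions made for illustration only, not the paper's actual method.

# Minimal, self-contained sketch (NumPy only) of relevance-sensitive attention
# and a length-ratio adjustment. All names here are illustrative assumptions,
# not the paper's implementation.

import numpy as np

def token_relevance(query_vecs, doc_vecs, eps=1e-8):
    """Score each source token by its maximum cosine similarity to any query token."""
    q = query_vecs / (np.linalg.norm(query_vecs, axis=1, keepdims=True) + eps)
    d = doc_vecs / (np.linalg.norm(doc_vecs, axis=1, keepdims=True) + eps)
    sims = d @ q.T                       # (doc_len, query_len)
    return sims.max(axis=1).clip(min=0)  # one relevance score per source token

def rsa_attention(base_attention, relevance, eps=1e-8):
    """Re-weight a decoder attention distribution by query relevance and renormalize."""
    weighted = base_attention * relevance
    return weighted / (weighted.sum() + eps)

def length_ratio(target_words, model_default=100):
    """Ratio used to stretch the decoder's length budget (e.g. 250-word DUC summaries)."""
    return target_words / model_default

# Toy usage: 5 source tokens, 3 query tokens, 4-dimensional embeddings.
rng = np.random.default_rng(0)
doc_vecs, query_vecs = rng.normal(size=(5, 4)), rng.normal(size=(3, 4))
base_att = np.full(5, 0.2)               # one uniform attention step from the decoder
rel = token_relevance(query_vecs, doc_vecs)
print(rsa_attention(base_att, rel))      # attention mass shifted toward query-relevant tokens
print(length_ratio(250))                 # -> 2.5

The key design point the sketch tries to convey is that query relevance is applied at decoding time, re-weighting the attention of a model pre-trained on generic summarization, so no QFS-specific training data is required.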


Related research

- Domain Adaptation with Pre-trained Transformers for Query Focused Abstractive Text Summarization (12/22/2021): The Query Focused Text Summarization (QFTS) task aims at building system...
- Improve Query Focused Abstractive Summarization by Incorporating Answer Relevance (05/27/2021): Query focused summarization (QFS) models aim to generate summaries from ...
- Abstractive Query Focused Summarization with Query-Free Resources (12/29/2020): The availability of large-scale datasets has driven the development of n...
- A Lightweight Constrained Generation Alternative for Query-focused Summarization (04/23/2023): Query-focused summarization (QFS) aims to provide a summary of a documen...
- AttSum: Joint Learning of Focusing and Summarization with Neural Attention (04/01/2016): Query relevance ranking and sentence saliency ranking are the two main t...
- DynE: Dynamic Ensemble Decoding for Multi-Document Summarization (06/15/2020): Sequence-to-sequence (s2s) models are the basis for extensive work in na...
- Generating summaries tailored to target characteristics (12/18/2019): Recently, research efforts have gained pace to cater to varied user pref...
