Conditional Neural Generation using Sub-Aspect Functions for Extractive News Summarization

04/29/2020
by Zhengyuan Liu, et al.

Much progress has been made in text summarization, fueled by neural architectures trained on large-scale corpora. However, reference summaries tend to be position-biased and constructed in an under-constrained fashion, especially for benchmark datasets in the news domain. We propose a neural framework that can flexibly control which sub-aspect functions (i.e., importance, diversity, position) to focus on during summary generation. We demonstrate that automatically extracted summaries with minimal position bias achieve performance at least equivalent to that of standard models that exploit position bias. We also show that news summaries generated with a focus on diversity tend to be preferred by human raters. These results suggest that a more flexible neural summarization framework can provide more control options, tailored to different application needs. Such flexibility is useful because it is often difficult to know or articulate a priori what the user preferences of a given application are.
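Since the abstract only describes the framework at a high level, the sketch below is a simplified, hypothetical illustration of the general idea of sub-aspect control, not the paper's neural architecture: a weight vector over importance, diversity, and position steers which sentences an extractive selector picks. The scorers here are crude heuristic stand-ins (word frequency, lexical overlap, sentence index), chosen only to keep the example self-contained.

```python
# Minimal sketch of sub-aspect-controlled extractive selection (illustrative only).
# Heuristic stand-ins: importance ~ word frequency, diversity ~ low overlap with
# already-selected sentences, position ~ earlier sentences score higher (lead bias).
from collections import Counter
from typing import Dict, List


def tokenize(sentence: str) -> List[str]:
    return [w.lower().strip(".,!?") for w in sentence.split()]


def importance_score(sentence: str, doc_freq: Counter) -> float:
    # Approximate importance as the average document frequency of the sentence's words.
    tokens = tokenize(sentence)
    return sum(doc_freq[t] for t in tokens) / max(len(tokens), 1)


def diversity_score(sentence: str, selected: List[str]) -> float:
    # Reward sentences that overlap little with the summary built so far.
    if not selected:
        return 1.0
    tokens = set(tokenize(sentence))
    chosen = {t for s in selected for t in tokenize(s)}
    return 1.0 - len(tokens & chosen) / max(len(tokens), 1)


def position_score(index: int, num_sentences: int) -> float:
    # Earlier sentences score higher, mimicking the lead bias of news articles.
    return 1.0 - index / max(num_sentences - 1, 1)


def extract_summary(sentences: List[str], weights: Dict[str, float], k: int = 3) -> List[str]:
    """Greedily pick k sentences under a weighted mix of sub-aspect scores.

    `weights` controls which sub-aspects the extractor focuses on, e.g.
    {"importance": 0.5, "diversity": 0.5, "position": 0.0} de-emphasizes position.
    """
    doc_freq = Counter(t for s in sentences for t in tokenize(s))
    selected: List[str] = []
    remaining = list(enumerate(sentences))
    while remaining and len(selected) < k:
        scored = [
            (weights.get("importance", 0.0) * importance_score(s, doc_freq)
             + weights.get("diversity", 0.0) * diversity_score(s, selected)
             + weights.get("position", 0.0) * position_score(i, len(sentences)),
             i, s)
            for i, s in remaining
        ]
        _, best_i, best_s = max(scored)
        selected.append(best_s)
        remaining = [(i, s) for i, s in remaining if i != best_i]
    return selected
```

Varying the weight vector plays the role of the conditioning signal: setting the position weight to zero corresponds to the low-position-bias setting discussed above, while raising the diversity weight favors less redundant summaries.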
