Towards Supervised Extractive Text Summarization via RNN-based Sequence Classification

11/13/2019
by Eduardo Brito, et al.

This article briefly explains our approach submitted to the DocEng'19 competition on extractive summarization. We implemented a recurrent neural network (RNN) based model that learns to classify whether each sentence of an article belongs to the corresponding extractive summary. We bypass the lack of large annotated news corpora for extractive summarization by generating extractive summaries from the abstractive ones available in the CNN corpus.
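The label-generation step described above can be illustrated with a small sketch. This is not the authors' published procedure; it is a hypothetical greedy matcher (a simple token-overlap score stands in for a ROUGE-style metric) that selects article sentences until they stop improving coverage of the abstractive summary, yielding a 0/1 label per sentence for supervised training:

```python
def overlap_score(candidate_tokens, summary_tokens):
    """Fraction of the summary's tokens covered by the candidate tokens."""
    if not summary_tokens:
        return 0.0
    return len(candidate_tokens & summary_tokens) / len(summary_tokens)


def extractive_labels(article_sentences, abstractive_summary):
    """Greedily derive per-sentence 0/1 extractive labels.

    Repeatedly add the sentence that most improves token coverage of the
    abstractive summary; stop when no sentence improves the score.
    """
    summary_tokens = set(abstractive_summary.lower().split())
    selected = set()       # indices of sentences chosen so far
    covered = set()        # union of tokens of chosen sentences
    best = 0.0
    while True:
        gains = []
        for i, sent in enumerate(article_sentences):
            if i in selected:
                continue
            tokens = covered | set(sent.lower().split())
            gains.append((overlap_score(tokens, summary_tokens), i))
        if not gains:
            break
        score, idx = max(gains)
        if score <= best:  # no sentence improves coverage any further
            break
        best = score
        selected.add(idx)
        covered |= set(article_sentences[idx].lower().split())
    return [1 if i in selected else 0 for i in range(len(article_sentences))]
```

The resulting binary sequence serves as the classification target for each sentence, which is the supervision signal a sequence-classification RNN needs.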


Related research

- SummaRuNNer: A Recurrent Neural Network based Sequence Model for Extractive Summarization of Documents (11/14/2016)
  We present SummaRuNNer, a Recurrent Neural Network (RNN) based sequence ...

- LCSTS: A Large Scale Chinese Short Text Summarization Dataset (06/19/2015)
  Automatic text summarization is widely regarded as the highly difficult ...

- What comes next? Extractive summarization by next-sentence prediction (01/12/2019)
  Existing approaches to automatic summarization assume that a length limi...

- NEWTS: A Corpus for News Topic-Focused Summarization (05/31/2022)
  Text summarization models are approaching human levels of fidelity. Exis...

- ScisummNet: A Large Annotated Corpus and Content-Impact Models for Scientific Paper Summarization with Citation Networks (09/04/2019)
  Scientific article summarization is challenging: large, annotated corpor...

- Self-Supervision based Task-Specific Image Collection Summarization (12/19/2020)
  Successful applications of deep learning (DL) requires large amount of a...
