Dynamic Sliding Window for Meeting Summarization

08/31/2021
by Zhengyuan Liu, et al.

Recently, abstractive spoken language summarization has attracted growing research interest, and neural sequence-to-sequence approaches have brought significant performance improvements. However, summarizing long meeting transcripts remains challenging. Because both the source content and the target summaries are long, neural models are easily distracted by irrelevant context and produce summaries of degraded quality. Moreover, pre-trained language models with input length limitations cannot be readily applied to long sequences. In this work, we first analyze the linguistic characteristics of meeting transcripts on a representative corpus and find that the sentences comprising a summary correlate with the meeting agenda. Based on this observation, we propose a dynamic sliding window strategy for meeting summarization. Experimental results show that performance benefits from the proposed method, and that the outputs achieve higher factual consistency than those of the base model.
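To illustrate the general idea of windowed summarization of a long transcript, the sketch below slides a window over the utterances, summarizes each window separately, and concatenates the partial summaries. This is only a minimal sketch under assumed names: `summarize_fn`, `window_size`, and `stride` are hypothetical placeholders, the summarizer is any generic seq2seq model, and the fixed stride shown here stands in for the paper's dynamic strategy, which adapts window placement based on the observed correlation with the meeting agenda.

```python
from typing import Callable, List

def sliding_window_summarize(
    utterances: List[str],
    summarize_fn: Callable[[str], str],  # hypothetical: any seq2seq summarizer
    window_size: int = 60,               # utterances per window (assumed value)
    stride: int = 40,                    # overlap preserves context across windows
) -> str:
    """Summarize a long transcript by sliding a window over its utterances.

    Illustrative fixed-stride variant only; the paper's dynamic strategy
    chooses window boundaries adaptively rather than by a constant stride.
    """
    partial_summaries = []
    start = 0
    while start < len(utterances):
        # Summarize the current window of utterances.
        window = utterances[start:start + window_size]
        partial_summaries.append(summarize_fn(" ".join(window)))
        if start + window_size >= len(utterances):
            break
        start += stride
    # Concatenate window-level summaries into the final meeting summary.
    return " ".join(partial_summaries)
```

In practice, each window stays within the input length limit of the pre-trained summarizer, which is the constraint motivating the windowed approach in the abstract.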


