HEGEL: Hypergraph Transformer for Long Document Summarization

by Haopeng Zhang, et al.

Extractive summarization of long documents is challenging because of the extended, structured input context. Long-distance sentence dependencies hinder cross-sentence relation modeling, the critical step in extractive summarization. This paper proposes HEGEL, a hypergraph neural network for long document summarization that captures high-order cross-sentence relations. HEGEL learns and updates effective sentence representations with hypergraph transformer layers and fuses different types of sentence dependencies, including latent topics, keyword coreference, and section structure. We validate HEGEL with extensive experiments on two benchmark datasets, and the results demonstrate its effectiveness and efficiency.
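The abstract does not give HEGEL's exact layer equations, but the core idea of a hypergraph layer, where sentences are nodes and each latent topic, keyword, or section defines a hyperedge connecting several sentences, can be illustrated with a minimal two-stage message-passing sketch. The mean aggregation below is an assumption for illustration; the paper's actual layers use attention.

```python
import numpy as np

def hypergraph_layer(X, H):
    """One round of two-stage hypergraph message passing (illustrative sketch).

    X: (n_sentences, d) sentence embeddings.
    H: (n_sentences, n_hyperedges) incidence matrix; H[i, e] = 1 if
       sentence i belongs to hyperedge e (a topic, keyword, or section).
    """
    # Stage 1: each hyperedge aggregates (here: averages) the sentences it connects.
    edge_deg = H.sum(axis=0, keepdims=True)        # (1, n_hyperedges)
    E = (H.T @ X) / np.maximum(edge_deg.T, 1.0)    # (n_hyperedges, d)
    # Stage 2: each sentence aggregates the hyperedges it belongs to.
    node_deg = H.sum(axis=1, keepdims=True)        # (n_sentences, 1)
    return (H @ E) / np.maximum(node_deg, 1.0)     # (n_sentences, d)

# Tiny example: 3 sentences, 2 hyperedges (e.g. one topic, one section).
X = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
H = np.array([[1.0, 0.0], [1.0, 1.0], [0.0, 1.0]])
X_new = hypergraph_layer(X, H)
```

Because every sentence in a hyperedge receives the same hyperedge message, a single layer already propagates information between sentences that are far apart in the document, which is how such models sidestep long-distance dependency limits of sequential encoders.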




Related papers:

- Hi-Transformer: Hierarchical Interactive Transformer for Efficient and Effective Long Document Modeling
- Enhancing Extractive Text Summarization with Topic-Aware Graph Neural Networks
- Contrastive Hierarchical Discourse Graph for Scientific Document Summarization
- Query-oriented text summarization based on hypergraph transversals
- Structure-Infused Copy Mechanisms for Abstractive Summarization
- Incorporating Linguistic Knowledge for Abstractive Multi-document Summarization
- HETFORMER: Heterogeneous Transformer with Sparse Attention for Long-Text Extractive Summarization
