MacNet: Transferring Knowledge from Machine Comprehension to Sequence-to-Sequence Models

07/23/2019
by Boyuan Pan, et al.

Machine Comprehension (MC) is one of the core problems in natural language processing, requiring both understanding of natural language and knowledge about the world. Rapid progress has been made since the release of several benchmark datasets, and recently the state-of-the-art models have even surpassed human performance on the well-known SQuAD evaluation. In this paper, we transfer knowledge learned from machine comprehension to sequence-to-sequence tasks to deepen the understanding of the text. We propose MacNet, a novel supplementary encoder-decoder architecture for the widely used attention-based sequence-to-sequence models. Experiments on neural machine translation (NMT) and abstractive text summarization show that our proposed framework can significantly improve the performance of the baseline models, and our method for abstractive text summarization achieves state-of-the-art results on the Gigaword dataset.
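To make the transfer idea concrete, here is a minimal PyTorch sketch, not the authors' implementation: it assumes (as a simplification) that the MC model's encoding layer is a frozen bidirectional LSTM whose outputs are concatenated with the seq2seq encoder's input embeddings before attention-based decoding. The class name, the `mc_encoder` interface, and all dimensions are illustrative.

```python
# Hedged sketch of a MacNet-style knowledge transfer: a pre-trained MC
# encoding layer (frozen) supplements a standard seq2seq encoder.
import torch
import torch.nn as nn


class MacNetStyleEncoder(nn.Module):
    """Seq2seq encoder augmented with a pre-trained MC encoding layer.

    `mc_encoder` stands in for the encoding layer of a machine-comprehension
    model (e.g. one trained on SQuAD); its interface here is a hypothetical
    simplification of the paper's architecture.
    """

    def __init__(self, embed_dim, hidden_dim, mc_encoder, mc_dim):
        super().__init__()
        self.mc_encoder = mc_encoder
        for p in self.mc_encoder.parameters():  # keep transferred knowledge fixed
            p.requires_grad = False
        self.rnn = nn.LSTM(embed_dim + mc_dim, hidden_dim,
                           batch_first=True, bidirectional=True)

    def forward(self, embeddings):
        # embeddings: (batch, seq_len, embed_dim)
        with torch.no_grad():
            mc_features, _ = self.mc_encoder(embeddings)  # (batch, seq_len, mc_dim)
        fused = torch.cat([embeddings, mc_features], dim=-1)
        outputs, _ = self.rnn(fused)  # states consumed by an attention decoder
        return outputs


if __name__ == "__main__":
    # Stand-in for a pre-trained MC encoding layer (hypothetical weights).
    mc = nn.LSTM(256, 128, batch_first=True, bidirectional=True)  # output dim 256
    enc = MacNetStyleEncoder(embed_dim=256, hidden_dim=512, mc_encoder=mc, mc_dim=256)
    x = torch.randn(2, 10, 256)  # fake batch of token embeddings
    print(enc(x).shape)          # torch.Size([2, 10, 1024])
```

Freezing the transferred layer is one plausible design choice here; whether the MC parameters stay fixed or are fine-tuned alongside the seq2seq model is a detail the abstract does not specify.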

Related research

- 10/20/2018: Abstractive Summarization Using Attentive Neural Techniques
  In a world of proliferating data, the ability to rapidly summarize text ...

- 10/10/2021: Enhance Long Text Understanding via Distilled Gist Detector from Abstractive Summarization
  Long text understanding is important yet challenging in natural language...

- 05/24/2018: Deep Reinforcement Learning For Sequence to Sequence Models
  In recent years, sequence-to-sequence (seq2seq) models are used in a var...

- 09/25/2020: Persian Keyphrase Generation Using Sequence-to-Sequence Models
  Keyphrases are a very short summary of an input text and provide the mai...

- 11/05/2018: Structured Neural Summarization
  Summarization of long sequences into a concise statement is a core probl...

- 10/29/2019: BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension
  We present BART, a denoising autoencoder for pretraining sequence-to-seq...

- 02/14/2022: Sequence-to-Sequence Resources for Catalan
  In this work, we introduce sequence-to-sequence language resources for C...
