Query and Output: Generating Words by Querying Distributed Word Representations for Paraphrase Generation

03/05/2018
by Shuming Ma, et al.

Most recent approaches to paraphrase generation use the sequence-to-sequence model. However, the existing sequence-to-sequence model tends to memorize the words and patterns in the training dataset instead of learning the meaning of the words, so the generated sentences are often grammatically correct but semantically improper. In this work, we introduce a novel model based on the encoder-decoder framework, called the Word Embedding Attention Network (WEAN). Our proposed model generates words by querying distributed word representations (i.e., neural word embeddings), with the aim of capturing the meaning of the corresponding words. Following previous work, we evaluate our model on two paraphrase-oriented tasks, namely text simplification and short-text abstractive summarization. Experimental results show that our model outperforms the sequence-to-sequence baseline by 6.3 and 5.5 BLEU points on two English text simplification datasets, and by 5.7 ROUGE-2 F1 points on a Chinese summarization dataset. Moreover, our model achieves state-of-the-art performance on these three benchmark datasets.
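To make the core idea concrete, below is a minimal sketch in PyTorch (not the authors' released code) of a decoder output layer that picks the next word by scoring a query vector against the word embedding matrix. The module name EmbeddingQueryOutput, the linear projection into embedding space, and the dot-product scoring are illustrative assumptions, not necessarily the exact scoring function used in WEAN.

# Sketch of an output layer that generates words by querying word embeddings.
import torch
import torch.nn as nn

class EmbeddingQueryOutput(nn.Module):
    def __init__(self, hidden_size: int, embed_size: int, vocab_size: int):
        super().__init__()
        # Distributed word representations that serve as the "keys" being queried.
        self.embedding = nn.Embedding(vocab_size, embed_size)
        # Maps the decoder hidden state into embedding space to form the query.
        self.query_proj = nn.Linear(hidden_size, embed_size)

    def forward(self, decoder_state: torch.Tensor) -> torch.Tensor:
        # decoder_state: (batch, hidden_size) -> logits over the vocabulary (batch, vocab_size)
        query = self.query_proj(decoder_state)
        # Similarity between the query and every word embedding; the best match is emitted.
        return query @ self.embedding.weight.t()

# Toy usage: one decoding step for a batch of two sentences.
layer = EmbeddingQueryOutput(hidden_size=512, embed_size=300, vocab_size=10000)
state = torch.randn(2, 512)
next_word_ids = layer(state).argmax(dim=-1)

Scoring candidates against the embedding table, rather than through a separately learned softmax layer, is one way to realize the abstract's goal of grounding generation in the meaning encoded by the word embeddings instead of memorized surface patterns.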

Related research

03/05/2018  Word Embedding Attention Network: Generating Words by Querying Distributed Word Representations for Paraphrase Generation
08/19/2016  Learning to Start for Sequence to Sequence Architecture
05/15/2018  Simplifying Sentences with Sequence to Sequence Models
11/30/2019  Tag Recommendation by Word-Level Tag Sequence Modeling
03/16/2018  A Meaning-based Statistical English Math Word Problem Solver
11/28/2016  Joint Copying and Restricted Generation for Paraphrase
02/05/2021  Spell Correction for Azerbaijani Language using Deep Neural Networks
