Sentence Simplification with Memory-Augmented Neural Networks

04/20/2018
by Tu Vu, et al.

Sentence simplification aims to simplify the content and structure of complex sentences, making them easier for human readers to interpret and for downstream NLP applications to process. Recent advances in neural machine translation have paved the way for novel approaches to the task. In this paper, we adapt an architecture with augmented memory capacities called Neural Semantic Encoders (Munkhdalai and Yu, 2017) for sentence simplification. Our experiments demonstrate the effectiveness of our approach on different simplification datasets, both in terms of automatic evaluation measures and human judgments.
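The Neural Semantic Encoders architecture cited above processes a sentence by repeatedly reading from, composing with, and writing to a variable-size memory whose slots correspond to input tokens. Below is a minimal NumPy sketch of one such read-compose-write step, under simplifying assumptions (a single sentence memory, a plain concatenate-and-tanh compose function, no learned read/write modules); all names and the parameterization `W_c` are illustrative, not the paper's exact formulation.

```python
import numpy as np

def softmax(v):
    """Numerically stable softmax over a 1-D array."""
    e = np.exp(v - v.max())
    return e / e.sum()

def nse_step(o_t, memory, W_c):
    """One read-compose-write step of a memory-augmented encoder.

    o_t:    (d,) encoding of the current input token
    memory: (k, d) memory matrix, one slot per input token
    W_c:    (d, 2d) compose weights (illustrative parameterization)
    """
    # Read: attend over memory slots, using the current encoding as the key.
    z = softmax(memory @ o_t)                 # (k,) attention weights
    m_t = z @ memory                          # (d,) retrieved memory vector
    # Compose: combine the input encoding with the retrieved content.
    h_t = np.tanh(W_c @ np.concatenate([o_t, m_t]))   # (d,)
    # Write: softly overwrite the attended slots with the composed vector.
    memory = (1 - z)[:, None] * memory + z[:, None] * h_t[None, :]
    return h_t, memory, z
```

Run over a sentence, the sequence of composed vectors `h_t` plays the role the decoder conditions on; the soft write lets later tokens revise the representations of earlier ones, which is the property the paper exploits for simplification.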


