Sentence Simplification with Memory-Augmented Neural Networks

by Tu Vu et al.
University of Massachusetts Amherst
University of Massachusetts Medical School

Sentence simplification aims to simplify the content and structure of complex sentences, and thus make them easier to interpret for human readers, and easier to process for downstream NLP applications. Recent advances in neural machine translation have paved the way for novel approaches to the task. In this paper, we adapt an architecture with augmented memory capacities called Neural Semantic Encoders (Munkhdalai and Yu, 2017) for sentence simplification. Our experiments demonstrate the effectiveness of our approach on different simplification datasets, both in terms of automatic evaluation measures and human judgments.
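The Neural Semantic Encoder the abstract refers to is a memory-augmented network that, at each time step, reads from a sentence-sized memory with attention, composes the retrieved vector with the current input, and writes the result back. The sketch below is a simplified, hedged illustration of that read-compose-write loop using plain NumPy; the actual model uses learned LSTM read, compose, and write modules, and the function names and shapes here are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def nse_step(memory, x, compose_W):
    """One simplified read-compose-write step of a Neural Semantic
    Encoder. `memory` has shape (slots, d); `x` is the current input
    embedding of size d; `compose_W` is an illustrative linear compose
    function standing in for the model's compose LSTM."""
    # Read: attend over memory slots, using the current input as the key.
    attn = softmax(memory @ x)                  # (slots,)
    read = attn @ memory                        # retrieved summary, (d,)
    # Compose: combine the input with what was read.
    composed = np.tanh(compose_W @ np.concatenate([x, read]))
    # Write: blend the composed vector back into the attended slots.
    memory = memory * (1 - attn[:, None]) + np.outer(attn, composed)
    return memory, composed

# Toy usage: encode a 3-word "sentence" with 4 memory slots of size 5.
rng = np.random.default_rng(0)
d, slots = 5, 4
memory = rng.standard_normal((slots, d)) * 0.1
compose_W = rng.standard_normal((d, 2 * d)) * 0.1
for x in rng.standard_normal((3, d)):
    memory, h = nse_step(memory, x, compose_W)
```

After the loop, `memory` retains its `(slots, d)` shape while its contents have been updated by each input word, which is the property that lets the encoder revisit and revise earlier words during simplification.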


Fusing Recency into Neural Machine Translation with an Inter-Sentence Gate Model

Neural machine translation (NMT) systems are usually trained on a large ...

Neural Semantic Encoders

We present a memory augmented neural network for natural language unders...

Learning Joint Multilingual Sentence Representations with Neural Machine Translation

In this paper, we use the framework of neural machine translation to lea...

Exploiting Semantics in Neural Machine Translation with Graph Convolutional Networks

Semantic representations have long been argued as potentially useful for...

Improving Neural Text Simplification Model with Simplified Corpora

Text simplification (TS) can be viewed as monolingual translation task, ...

Intelligent Translation Memory Matching and Retrieval with Sentence Encoders

Matching and retrieving previously translated segments from a Translatio...

Modelling Interaction of Sentence Pair with coupled-LSTMs

Recently, there is rising interest in modelling the interactions of two ...
