Syntactically Guided Neural Machine Translation

05/15/2016
by   Felix Stahlberg, et al.

We investigate the use of hierarchical phrase-based SMT lattices in end-to-end neural machine translation (NMT). Weight pushing transforms the Hiero scores for complete translation hypotheses, with the full translation grammar score and full n-gram language model score, into posteriors compatible with NMT predictive probabilities. With a slightly modified NMT beam-search decoder we find gains over both Hiero and NMT decoding alone, with practical advantages in extending NMT to very large input and output vocabularies.
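
To make the combination concrete, the sketch below shows one beam-search step that interpolates NMT predictive log-probabilities with weight-pushed lattice log-posteriors. It is a minimal illustration under assumptions, not the paper's implementation: the callables `nmt_step_logprobs` and `lattice_step_logposteriors`, the linear interpolation weight `nmt_weight`, and the toy distributions are hypothetical names chosen for clarity. Restricting the candidate tokens at each step to those reachable in the lattice is one way such a decoder can keep the effective output vocabulary manageable.

```python
import math

def combined_beam_step(hypotheses, nmt_step_logprobs, lattice_step_logposteriors,
                       beam_size, nmt_weight=0.5):
    """One beam-search step interpolating NMT log-probabilities with
    weight-pushed lattice log-posteriors (illustrative sketch only)."""
    expanded = []
    for prefix, score in hypotheses:
        nmt_dist = nmt_step_logprobs(prefix)             # token -> log P_NMT(token | prefix)
        hiero_dist = lattice_step_logposteriors(prefix)  # token -> log posterior from the lattice
        # Only tokens reachable in the lattice are expanded, so the lattice
        # also constrains the output vocabulary at this step.
        for token, hiero_lp in hiero_dist.items():
            nmt_lp = nmt_dist.get(token, float("-inf"))
            combined = nmt_weight * nmt_lp + (1.0 - nmt_weight) * hiero_lp
            expanded.append((prefix + [token], score + combined))
    # Keep the best `beam_size` partial hypotheses.
    expanded.sort(key=lambda hyp: hyp[1], reverse=True)
    return expanded[:beam_size]

if __name__ == "__main__":
    # Toy distributions over a two-token vocabulary, for illustration only.
    nmt = lambda prefix: {"a": math.log(0.7), "b": math.log(0.3)}
    hiero = lambda prefix: {"a": math.log(0.4), "b": math.log(0.6)}
    print(combined_beam_step([([], 0.0)], nmt, hiero, beam_size=2))
```

Any real decoder would additionally handle end-of-sentence tokens and hypothesis completion; those details are omitted here to keep the sketch short.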

Related research

- Improving Neural Machine Translation through Phrase-based Forced Decoding (11/01/2017)
- Exploring Recombination for Efficient Decoding of Neural Machine Translation (08/25/2018)
- Correcting Length Bias in Neural Machine Translation (08/29/2018)
- Dynamic Oracle for Neural Machine Translation in Decoding Phase (09/19/2017)
- Fast Lexically Constrained Decoding with Dynamic Beam Allocation for Neural Machine Translation (04/18/2018)
- Sharp Models on Dull Hardware: Fast and Accurate Neural Machine Translation Decoding on the CPU (05/04/2017)
- Beam Search Strategies for Neural Machine Translation (02/06/2017)
