Noisy Parallel Approximate Decoding for Conditional Recurrent Language Model

by Kyunghyun Cho, et al.
New York University

Recent advances in conditional recurrent language modelling have mainly focused on network architectures (e.g., the attention mechanism), learning algorithms (e.g., scheduled sampling and sequence-level training) and novel applications (e.g., image/video description generation, speech recognition, etc.). On the other hand, decoding algorithms/strategies have received comparatively little attention, and it has become standard to use greedy or beam search. In this paper, we propose a novel decoding strategy motivated by an earlier observation that the nonlinear hidden layers of a deep neural network stretch the data manifold. The proposed strategy is embarrassingly parallelizable without any communication overhead, while improving upon an existing decoding algorithm. We extensively evaluate it with attention-based neural machine translation on the task of En->Cz translation.
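The strategy the abstract describes amounts to running several greedy decodings in parallel, each perturbing the decoder's hidden state with Gaussian noise, and then keeping the candidate that scores best under the noiseless model. Below is a minimal, self-contained sketch of that idea with a toy recurrent decoder; the model, the weights, and the annealed noise schedule `sigma / (t + 1)` are illustrative assumptions for the sketch, not the paper's exact setup.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def step(h, token, W_h, W_x, W_o):
    """One toy recurrent decoder step: new hidden state and output logits."""
    h_new = np.tanh(W_h @ h + W_x[token])
    return h_new, W_o @ h_new

def greedy_decode(W_h, W_x, W_o, max_len=8, sigma=0.0, rng=None):
    """Greedy decoding; if sigma > 0, Gaussian noise (annealed here as
    sigma / (t + 1), an assumed schedule) is injected into the hidden state."""
    h = np.zeros(W_h.shape[0])
    token, seq = 0, []
    for t in range(max_len):
        if sigma > 0.0:
            h = h + rng.normal(0.0, sigma / (t + 1), size=h.shape)
        h, logits = step(h, token, W_h, W_x, W_o)
        token = int(softmax(logits).argmax())
        seq.append(token)
    return seq

def score(seq, W_h, W_x, W_o):
    """Log-probability of a candidate sequence under the noiseless model."""
    h, token, logp = np.zeros(W_h.shape[0]), 0, 0.0
    for nxt in seq:
        h, logits = step(h, token, W_h, W_x, W_o)
        logp += np.log(softmax(logits)[nxt])
        token = nxt
    return logp

def noisy_parallel_decode(W_h, W_x, W_o, n_chains=8, sigma=0.5, max_len=8):
    """Run independent noisy decodings (trivially parallelizable; sequential
    here for clarity) plus one noiseless chain; return the best candidate."""
    candidates = [greedy_decode(W_h, W_x, W_o, max_len)]  # noiseless baseline
    for i in range(n_chains):
        rng = np.random.default_rng(i)  # one independent seed per chain
        candidates.append(greedy_decode(W_h, W_x, W_o, max_len, sigma, rng))
    return max(candidates, key=lambda s: score(s, W_h, W_x, W_o))
```

Because the noiseless greedy candidate is always kept in the pool, the selected sequence never scores worse than plain greedy decoding, and the chains share no state, so they can run in parallel with no communication overhead.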




