Professor Forcing: A New Algorithm for Training Recurrent Networks

10/27/2016
by Alex Lamb, et al.

The Teacher Forcing algorithm trains recurrent networks by supplying observed sequence values as inputs during training, while using the network's own one-step-ahead predictions to do multi-step sampling. We introduce the Professor Forcing algorithm, which uses adversarial domain adaptation to encourage the dynamics of the recurrent network to be the same when training the network and when sampling from the network over multiple time steps. We apply Professor Forcing to language modeling, vocal synthesis on raw waveforms, handwriting generation, and image generation. Empirically we find that Professor Forcing acts as a regularizer, improving test likelihood on character-level Penn Treebank and sequential MNIST. We also find that the model qualitatively improves samples, especially when sampling for a large number of time steps. This is supported by human evaluation of sample quality. Trade-offs between Professor Forcing and Scheduled Sampling are discussed. We produce t-SNE visualizations showing that Professor Forcing successfully makes the dynamics of the network during training and sampling more similar.
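To make the train/sample mismatch concrete, here is a minimal NumPy sketch (not the paper's architecture; weights, sizes, and the toy sequence are illustrative assumptions). It runs the same tiny RNN in teacher-forced mode, where ground-truth tokens are fed in at every step, and in free-running mode, where the model's own one-step predictions are fed back in. Professor Forcing trains a discriminator on the hidden-state sequences produced by these two modes and backpropagates its signal into the generator so that they become indistinguishable; the sketch only exposes the two trajectories whose divergence that discriminator would detect.

```python
import numpy as np

rng = np.random.default_rng(0)
H, V = 8, 4                       # hidden size, vocabulary size (demo values)
Wxh = rng.normal(0, 0.5, (V, H))  # input-to-hidden weights
Whh = rng.normal(0, 0.5, (H, H))  # hidden-to-hidden weights
Why = rng.normal(0, 0.5, (H, V))  # hidden-to-output weights

def step(h, x_id):
    """One RNN step: new hidden state plus greedy one-step-ahead prediction."""
    h = np.tanh(np.eye(V)[x_id] @ Wxh + h @ Whh)
    y_id = int(np.argmax(h @ Why))
    return h, y_id

def run(seq, free_running):
    """Collect hidden states; next input is ground truth or the model's guess."""
    h, x, states = np.zeros(H), seq[0], []
    for t in range(1, len(seq)):
        h, pred = step(h, x)
        states.append(h)
        x = pred if free_running else seq[t]  # the only difference between modes
    return np.stack(states)

data = [0, 1, 2, 3, 2, 1, 0]                # toy token sequence
h_teacher = run(data, free_running=False)   # training-time dynamics
h_free = run(data, free_running=True)       # sampling-time dynamics

# The two hidden-state trajectories generally diverge -- this is the
# behavior mismatch Professor Forcing's discriminator is trained to detect.
gap = np.abs(h_teacher - h_free).mean()
print(f"mean hidden-state gap: {gap:.3f}")
```

In the actual method the discriminator receives such hidden-state sequences (or functions of them) from both modes, and the recurrent network gets an extra loss term for fooling it, on top of the usual teacher-forced likelihood objective.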

