Deep State Space Models for Unconditional Word Generation

06/12/2018
by Florian Schmidt, et al.

Autoregressive feedback is considered a necessity for successful unconditional text generation using stochastic sequence models. However, such feedback is known to introduce systematic biases into training, and it obscures a principle of generation: committing to global information and forgetting local nuances. We show that a non-autoregressive deep state space model with a clear separation of global and local uncertainty can be built from only two ingredients: an independent noise source and a deterministic transition function. Recent advances in flow-based variational inference allow training an evidence lower bound without resorting to annealing, auxiliary losses, or similar measures. The result is a highly interpretable generative model on par with a comparable autoregressive model on the task of word generation.
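To make the two ingredients concrete, below is a minimal sketch of a non-autoregressive state space generator of the kind the abstract describes: an independent noise source drives a deterministic transition over a latent state, and tokens are emitted from the state without feeding previous outputs back in. All names, dimensions, and the choice of a GRU cell as the transition are illustrative assumptions, not the authors' exact architecture; the paper's flow-based variational training is likewise omitted here.

```python
import torch
import torch.nn as nn

class NonAutoregressiveSSM(nn.Module):
    """Sketch: state h_t = f(h_{t-1}, eps_t) with independent noise eps_t,
    deterministic transition f, and per-step emission. No generated token
    is ever fed back into the model (hypothetical illustration)."""

    def __init__(self, state_dim: int, noise_dim: int, vocab_size: int):
        super().__init__()
        # Deterministic transition f(h_{t-1}, eps_t) -> h_t
        self.transition = nn.GRUCell(noise_dim, state_dim)
        # Emission: map the latent state to a distribution over tokens
        self.emit = nn.Linear(state_dim, vocab_size)

    def forward(self, batch_size: int, seq_len: int) -> torch.Tensor:
        h = torch.zeros(batch_size, self.transition.hidden_size)
        logits = []
        for _ in range(seq_len):
            # Independent noise source: fresh Gaussian sample per step
            eps = torch.randn(batch_size, self.transition.input_size)
            # Transition is deterministic given (h, eps)
            h = self.transition(eps, h)
            logits.append(self.emit(h))
        return torch.stack(logits, dim=1)  # (batch, seq_len, vocab)

# Usage: sample token sequences without any autoregressive feedback
model = NonAutoregressiveSSM(state_dim=128, noise_dim=32, vocab_size=30)
token_logits = model(batch_size=4, seq_len=12)
tokens = token_logits.argmax(dim=-1)  # greedy; sampling also works
```

In this setup all stochasticity lives in the noise sequence, so the latent state carries the "global" commitment while per-step noise injects local variation, which is the separation of uncertainty the abstract highlights.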
