Enhanced Variational Inference with Dyadic Transformation

01/30/2019
by Sarin Chandy, et al.

The variational autoencoder (VAE) is a powerful deep generative model based on variational inference. In the VAE's original formulation, the latent variables are modeled as normal distributions with a diagonal covariance matrix, which limits the flexibility of the approximate posterior to match the true posterior distribution. We propose a new transformation, the dyadic transformation (DT), that can model a full multivariate normal distribution. DT is a single-stage transformation with low computational requirements. We demonstrate empirically on the MNIST dataset that DT enhances posterior flexibility and attains competitive results compared to other VAE enhancements.
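The abstract does not spell out the dyadic transformation itself, so the sketch below only illustrates the general idea it builds on: a standard VAE posterior samples with a diagonal covariance, whereas applying a learned linear map to the noise yields a full-covariance multivariate normal. The latent dimension, the lower-triangular matrix `L`, and all parameter values here are hypothetical, chosen purely for illustration and not taken from the paper.

```python
import numpy as np

# Minimal sketch (not the paper's DT): a diagonal-Gaussian VAE posterior samples
# z = mu + sigma * eps with eps ~ N(0, I), giving covariance diag(sigma^2).
# Mapping the noise through a learned matrix L instead, z = mu + L @ eps,
# gives a full covariance L @ L.T, i.e. a general multivariate normal.

rng = np.random.default_rng(0)
d = 4                                   # latent dimension (illustrative)
mu = rng.normal(size=d)                 # posterior mean
sigma = np.exp(rng.normal(size=d))      # diagonal std-devs (original VAE)
L = np.tril(rng.normal(size=(d, d)))    # hypothetical learned transformation

eps = rng.standard_normal((100_000, d)) # reparameterization noise

z_diag = mu + sigma * eps               # diagonal-covariance posterior samples
z_full = mu + eps @ L.T                 # full-covariance posterior samples

print(np.round(np.cov(z_diag, rowvar=False), 2))  # approx. diag(sigma**2)
print(np.round(np.cov(z_full, rowvar=False), 2))  # approx. L @ L.T, with non-zero off-diagonals
```

The point of the comparison is that the second sample covariance has non-zero off-diagonal entries, i.e. the transformed posterior can capture correlations between latent dimensions that the diagonal parameterization cannot.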
