HyperVAE: A Minimum Description Length Variational Hyper-Encoding Network

05/18/2020
by Phuoc Nguyen, et al.

We propose a framework called HyperVAE for encoding distributions of distributions. When a target distribution is modeled by a VAE, its neural network parameters θ are drawn from a distribution p(θ), which is itself modeled by a hyper-level VAE. We propose a variational inference scheme that uses Gaussian mixture models to implicitly encode the parameters θ into a low-dimensional Gaussian distribution. Given a target distribution, we predict the posterior distribution of the latent code and then use a matrix-network decoder to generate a posterior distribution q(θ). In contrast to common hyper-network practice, which generates only scale and bias vectors as target-network parameters, HyperVAE encodes the parameters θ in full. HyperVAE therefore preserves much more information about each task's model in the latent space. We discuss HyperVAE in terms of the minimum description length (MDL) principle and show that this principle helps HyperVAE generalize. We evaluate HyperVAE on density estimation, outlier detection, and discovery of novel design classes, demonstrating its efficacy.
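The contrast the abstract draws — generating full target-network parameters θ rather than only scale and bias vectors — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the layer sizes, the name `hyper_decode`, and the plain linear hyper-decoder `G` are all hypothetical stand-ins (the paper uses a matrix-network decoder, which factorizes this map to keep the hyper-network small).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical target-VAE layer shape, for illustration only
d_in, d_out = 8, 4          # target layer: W is (d_out, d_in), b is (d_out,)
d_u = 3                     # dimension of the low-dimensional latent code u

# Hyper-decoder parameters. In HyperVAE these are learned; here they
# are random, just to show the mechanics of generating theta in full.
G = rng.normal(size=(d_out * d_in + d_out, d_u)) * 0.1

def hyper_decode(u):
    """Map a low-dimensional latent code u to FULL target-network
    parameters theta = (W, b) — every weight, not just scale/bias."""
    theta = G @ u                               # flat parameter vector
    W = theta[:d_out * d_in].reshape(d_out, d_in)
    b = theta[d_out * d_in:]
    return W, b

u = rng.normal(size=d_u)                        # a sample latent code
W, b = hyper_decode(u)                          # generated target layer
h = np.tanh(W @ rng.normal(size=d_in) + b)      # run the generated layer
print(W.shape, b.shape, h.shape)
```

A scale-and-bias hyper-network would instead emit only 2·d_out numbers that modulate a fixed shared W; generating the whole of θ is what lets the latent code carry the full per-task model.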

