Variational auto-encoders with Student's t-prior

04/06/2020
by Najmeh Abiri et al.

We propose a new prior for variational auto-encoders (VAEs): the weakly informative multivariate Student's t-distribution. In the proposed model all distribution parameters are trained, allowing a more robust approximation of the underlying data distribution. In two experiments on Fashion-MNIST data, we compared the proposed VAEs with VAEs using the standard Gaussian prior. Both experiments showed better image reconstruction with the Student's t-prior.
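The abstract's key ingredients can be sketched in a few lines: the log-density of a Student's t with trainable location μ, scale σ, and degrees of freedom ν, plus reparameterized sampling via the Gaussian scale-mixture representation. This is a minimal NumPy/stdlib illustration, not the authors' implementation; the function names are hypothetical.

```python
import math
import numpy as np

def student_t_logpdf(x, mu, sigma, nu):
    """Log-density of a univariate Student's t with location mu,
    scale sigma, and degrees of freedom nu (all trainable in the paper's setup)."""
    z = (x - mu) / sigma
    return (math.lgamma((nu + 1) / 2) - math.lgamma(nu / 2)
            - 0.5 * math.log(nu * math.pi) - math.log(sigma)
            - (nu + 1) / 2 * math.log1p(z * z / nu))

def gaussian_logpdf(x, mu, sigma):
    """Log-density of a Gaussian, for comparison with the t tails."""
    z = (x - mu) / sigma
    return -0.5 * z * z - 0.5 * math.log(2 * math.pi) - math.log(sigma)

def sample_student_t(mu, sigma, nu, size, rng):
    """Draw t-samples as a Gaussian scale mixture:
    x = mu + sigma * g / sqrt(chi2_nu / nu), with g ~ N(0, 1).
    The pathwise dependence on mu and sigma is what makes these
    parameters trainable by reparameterization."""
    g = rng.standard_normal(size)
    chi2 = rng.chisquare(nu, size)
    return mu + sigma * g / np.sqrt(chi2 / nu)

rng = np.random.default_rng(0)
samples = sample_student_t(0.0, 1.0, 3.0, 100_000, rng)

# Heavy tails: with nu = 3, a point five scales from the mean is
# far more probable under the t than under a Gaussian.
print(student_t_logpdf(5.0, 0.0, 1.0, 3.0))  # ~ -5.47
print(gaussian_logpdf(5.0, 0.0, 1.0))        # ~ -13.42
```

The heavy tails are what make the prior weakly informative: latent codes far from the origin are penalized much less than under a Gaussian, which is consistent with the robustness claim in the abstract.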

Related research

11/05/2017  Wasserstein Auto-Encoders
We propose the Wasserstein Auto-Encoder (WAE)---a new algorithm for buil...

09/18/2018  Comparison between Suitable Priors for Additive Bayesian Networks
Additive Bayesian networks are types of graphical models that extend the...

11/19/2017  Diverse and Accurate Image Description Using a Variational Auto-Encoder with an Additive Gaussian Encoding Space
This paper explores image caption generation using conditional variation...

06/14/2019  Learning Correlated Latent Representations with Adaptive Priors
Variational Auto-Encoders (VAEs) have been widely applied for learning c...

05/14/2019  Correlated Variational Auto-Encoders
Variational Auto-Encoders (VAEs) are capable of learning latent represen...

10/08/2017  Reconstruction of Hidden Representation for Robust Feature Extraction
This paper aims to develop a new and robust approach to feature represen...

08/12/2016  Student's t Distribution based Estimation of Distribution Algorithms for Derivative-free Global Optimization
In this paper, we are concerned with a branch of evolutionary algorithms...
