Likelihood Assignment for Out-of-Distribution Inputs in Deep Generative Models is Sensitive to Prior Distribution Choice

by Ryo Kamoi, et al.

Recent work has shown that deep generative models assign higher likelihood to out-of-distribution inputs than to training data. We show that one factor underlying this phenomenon is a mismatch between the nature of the prior distribution and that of the data distribution, a problem found in widely used deep generative models such as VAEs and Glow. While a typical choice of prior is the standard Gaussian, the distributions of real data sets may not be consistent with a unimodal prior. This paper focuses on the relationship between the choice of prior distribution and the likelihoods assigned to out-of-distribution inputs. We propose using a mixture distribution as the prior to make the likelihoods assigned by deep generative models sensitive to out-of-distribution inputs. Furthermore, we explain the theoretical advantages of adopting a mixture prior, and we present experimental results to support our claims. Finally, we demonstrate that a mixture prior lowers the likelihood assigned to out-of-distribution inputs on two pairs of real image data sets: Fashion-MNIST vs. MNIST and CIFAR10 vs. SVHN.
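The intuition behind the mixture prior can be illustrated numerically: a unimodal Gaussian concentrates mass near the origin, while a mixture concentrates mass near its component means, so a latent code that falls between modes receives low density. The sketch below is not the paper's implementation; it uses NumPy with made-up component means, scales, and weights purely to show the effect on log-density.

```python
# Illustrative sketch (assumed setup, not the paper's code): compare the
# log-density that a unimodal Gaussian prior vs. a two-component Gaussian
# mixture prior assigns to latent codes. Means/scales/weights are invented.
import numpy as np

def log_gaussian(z, mu, sigma):
    """Log-density of an isotropic Gaussian N(mu, sigma^2 I) at z."""
    d = z.shape[-1]
    return (-0.5 * d * np.log(2.0 * np.pi * sigma**2)
            - 0.5 * np.sum((z - mu) ** 2, axis=-1) / sigma**2)

def log_mixture(z, mus, sigmas, weights):
    """Log-density under a mixture: log sum_k w_k N(z; mu_k, sigma_k^2 I)."""
    comps = np.array([np.log(w) + log_gaussian(z, m, s)
                      for m, s, w in zip(mus, sigmas, weights)])
    m = comps.max(axis=0)  # log-sum-exp for numerical stability
    return m + np.log(np.sum(np.exp(comps - m), axis=0))

d = 2
z_mode = np.full(d, 3.0)   # latent code near one mixture component
z_between = np.zeros(d)    # latent code between the two modes

mus = [np.full(d, 3.0), np.full(d, -3.0)]
sigmas = [1.0, 1.0]
weights = [0.5, 0.5]

# A broad unimodal N(0, 9I) prior rates the origin as highly likely,
# whereas the mixture rates it as unlikely relative to its modes.
print("unimodal at origin :", log_gaussian(z_between, np.zeros(d), 3.0))
print("mixture at origin  :", log_mixture(z_between, mus, sigmas, weights))
print("mixture at a mode  :", log_mixture(z_mode, mus, sigmas, weights))
```

Under this toy configuration, the mixture assigns markedly higher log-density at a mode than at the origin, which is the property the paper exploits: inputs whose latent codes do not land near a component of the prior receive low likelihood.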




