Understanding Failures in Out-of-Distribution Detection with Deep Generative Models

by Lily H. Zhang, et al.

Deep generative models (DGMs) seem a natural fit for detecting out-of-distribution (OOD) inputs, yet such models have been shown to assign higher probabilities or densities to OOD images than to images from the training distribution. In this work, we explain why this behavior should be attributed to model misestimation. We first prove that no method can guarantee performance beyond random chance without assumptions about which out-distributions are relevant. We then interrogate the typical set hypothesis: the claim that relevant out-distributions can lie in high-likelihood regions of the data distribution, and that OOD detection should therefore be defined with respect to the data distribution's typical set. We highlight the consequences of assuming support overlap between in- and out-distributions, as well as the arbitrariness of the typical set for OOD detection. Our results suggest that estimation error is a more plausible explanation of these failures than a misalignment between likelihood-based OOD detection and the out-distributions of interest. We illustrate how even minimal estimation error can lead to OOD detection failures, with implications for future work in deep generative modeling and OOD detection.
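The abstract's central point, that estimation error alone can make a likelihood-based detector favor OOD inputs, can be illustrated with a toy one-dimensional sketch. The data, model family, and numbers below are hypothetical, chosen only to exhibit the failure mode: the true in-distribution is bimodal, while the fitted density model is a single maximum-likelihood Gaussian, so the model places high density on a region the true distribution never generates.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical in-distribution: a balanced mixture of two Gaussians
# centered at -3 and +3, each with standard deviation 0.5.
n = 10_000
modes = rng.choice([-3.0, 3.0], size=n)
x_train = modes + 0.5 * rng.standard_normal(n)

# Misestimated density model: a single Gaussian fit by maximum likelihood.
# Its mean lands near 0 and its variance is inflated to span both modes.
mu, sigma = x_train.mean(), x_train.std()

def log_density(x):
    """Log-likelihood of x under the fitted (misestimated) model."""
    return -0.5 * np.log(2 * np.pi * sigma**2) - (x - mu) ** 2 / (2 * sigma**2)

# x = 0 lies between the modes, where the true distribution has essentially
# zero density -- a clearly OOD point. x = 3 sits on an in-distribution mode.
ood_score = log_density(0.0)
in_score = log_density(3.0)
print(ood_score > in_score)  # True: the OOD point gets the HIGHER likelihood
```

Thresholding this model's likelihood would flag the in-distribution point before the OOD one, even though the fit is the maximum-likelihood solution within its (misspecified) family, which is the sense in which a small estimation error suffices to break likelihood-based OOD detection.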




Related Research
Entropic Issues in Likelihood-Based OOD Detection

Deep generative models trained by maximum likelihood remain very popular...

Hierarchical VAEs Know What They Don't Know

Deep generative models have shown themselves to be state-of-the-art dens...

Further Analysis of Outlier Detection with Deep Generative Models

The recent, counter-intuitive discovery that deep generative models (DGM...

Unsupervised Out-of-Distribution Detection with Batch Normalization

Likelihood from a generative model is a natural statistic for detecting ...

Out-of-Distribution Detection with Class Ratio Estimation

Density-based Out-of-distribution (OOD) detection has recently been show...

DOI: Divergence-based Out-of-Distribution Indicators via Deep Generative Models

To ensure robust and reliable classification results, OoD (out-of-distri...
