
Unsupervised Out-of-Distribution Detection with Batch Normalization

by Jiaming Song et al.

Likelihood from a generative model is a natural statistic for detecting out-of-distribution (OoD) samples. However, generative models have been shown to assign higher likelihood to OoD samples than to samples from the training distribution, preventing simple threshold-based detection rules. We demonstrate that OoD detection fails even when using more sophisticated statistics based on the likelihoods of individual samples. To address these issues, we propose a new method that leverages batch normalization. We argue that batch normalization for generative models challenges the traditional i.i.d. data assumption and changes the corresponding maximum likelihood objective. Based on this insight, we propose to exploit in-batch dependencies for OoD detection. Empirical results suggest that this leads to more robust detection for high-dimensional images.
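The in-batch dependency the abstract refers to can be illustrated with a toy NumPy sketch (this is an illustrative stand-in, not the paper's actual model): when normalization statistics are computed from the batch itself, a sample's score depends on the other samples it is batched with, breaking the per-sample i.i.d. factorization of the likelihood.

```python
import numpy as np

def batchnorm_loglik(batch, eps=1e-5):
    # Batch normalization: mean and variance come from the batch itself,
    # so each sample's normalized value depends on every other sample.
    mu, var = batch.mean(), batch.var()
    z = (batch - mu) / np.sqrt(var + eps)
    # Log-likelihood of the normalized values under a standard normal --
    # a toy stand-in for a generative model with batch norm in train mode.
    return -0.5 * (z ** 2 + np.log(2 * np.pi))

rng = np.random.default_rng(0)
in_dist = rng.normal(0.0, 1.0, size=64)  # "training-like" batch
ood = rng.normal(5.0, 3.0, size=64)      # shifted, broader OoD batch

x = np.array([0.5])  # the same test point, scored in two batch contexts
score_in = batchnorm_loglik(np.concatenate([in_dist, x]))[-1]
score_ood = batchnorm_loglik(np.concatenate([ood, x]))[-1]
# score_in != score_ood: the statistic is batch-dependent, which is the
# in-batch dependency the paper proposes to exploit for OoD detection.
```

Because the score of `x` changes with its batch, a detector can score whole batches rather than individual samples, which is the intuition behind the proposed method.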
