Are Bayesian neural networks intrinsically good at out-of-distribution detection?

07/26/2021
by Christian Henning, et al.

The need to avoid confident predictions on unfamiliar data has sparked interest in out-of-distribution (OOD) detection. It is widely assumed that Bayesian neural networks (BNNs) are well suited for this task, as their epistemic uncertainty should lead to disagreement in predictions on outliers. In this paper, we question this assumption and provide empirical evidence that proper Bayesian inference with common neural network architectures does not necessarily lead to good OOD detection. To circumvent the use of approximate inference, we start by studying the infinite-width case, where Bayesian inference can be performed exactly via the corresponding Gaussian process. Strikingly, the kernels induced under common architectural choices lead to uncertainties that do not reflect the underlying data-generating process and are therefore unsuited for OOD detection. Finally, we study finite-width networks using Hamiltonian Monte Carlo (HMC) and observe OOD behavior consistent with the infinite-width case. Overall, our study discloses fundamental problems with naively using BNNs for OOD detection and opens interesting avenues for future research.
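The infinite-width analysis referred to in the abstract exploits the exact correspondence between Bayesian neural networks and Gaussian processes (the NNGP kernel). As a rough illustration of that setup, the sketch below computes the NNGP posterior predictive variance of a fully connected ReLU network using the neural-tangents library and uses that variance as an OOD score. This is a minimal sketch, not the authors' experimental code: the architecture, depth, observation-noise level, and regression-style posterior are illustrative assumptions.

```python
import jax.numpy as jnp
from neural_tangents import stax

# Infinite-width fully connected ReLU network; kernel_fn evaluates its NNGP kernel.
# Depth and widths here are arbitrary illustrative choices.
_, _, kernel_fn = stax.serial(
    stax.Dense(512), stax.Relu(),
    stax.Dense(512), stax.Relu(),
    stax.Dense(1),
)

def nngp_predictive(x_train, y_train, x_query, noise=1e-3):
    """Exact GP posterior mean and variance at x_query under the NNGP kernel."""
    k_tt = kernel_fn(x_train, x_train, 'nngp')   # train/train kernel matrix
    k_qt = kernel_fn(x_query, x_train, 'nngp')   # query/train kernel matrix
    k_qq = kernel_fn(x_query, x_query, 'nngp')   # query/query kernel matrix
    reg = k_tt + noise * jnp.eye(x_train.shape[0])
    mean = k_qt @ jnp.linalg.solve(reg, y_train)
    cov = k_qq - k_qt @ jnp.linalg.solve(reg, k_qt.T)
    return mean, jnp.diag(cov)

# Illustrative OOD scoring: treat inputs whose predictive variance exceeds a
# threshold as outliers. Whether this variance actually grows on OOD inputs
# under common architectures is precisely what the paper calls into question.
```

A usage note: with `x_train`, `y_train`, and `x_query` as 2D JAX arrays of shape (n, d), `nngp_predictive` returns the exact posterior the paper studies in the infinite-width limit, so any failure of the variance to separate in- from out-of-distribution points cannot be blamed on approximate inference.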


Related research

Uncertainty-based out-of-distribution detection requires suitable function space priors (10/12/2021)
Deep Maxout Network Gaussian Process (08/08/2022)
An Empirical Analysis of the Advantages of Finite- v.s. Infinite-Width Bayesian Neural Networks (11/16/2022)
Neural Tangents: Fast and Easy Infinite Neural Networks in Python (12/05/2019)
Bayesian neural network unit priors and generalized Weibull-tail property (10/06/2021)
Depth induces scale-averaging in overparameterized linear Bayesian neural networks (11/23/2021)
Why bigger is not always better: on finite and infinite neural networks (10/17/2019)
