Data augmentation in Bayesian neural networks and the cold posterior effect

06/10/2021
by   Seth Nabarro, et al.

Data augmentation is a highly effective approach for improving performance in deep neural networks. The standard view is that it creates an enlarged dataset by adding synthetic data, which raises a problem when combining it with Bayesian inference: how much data are we really conditioning on? This question is particularly relevant to recent observations linking data augmentation to the cold posterior effect. We investigate several principled ways of defining a log-likelihood for augmented datasets. Our approach prescribes augmenting the same underlying image multiple times, at both train and test time, and averaging either the logits or the predictive probabilities. Empirically, we observe the best performance when averaging probabilities. While there are interactions with the cold posterior effect, neither averaging logits nor averaging probabilities eliminates it.
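The abstract contrasts two ways of combining predictions over multiple augmentations of the same image: averaging in logit space versus averaging the predictive probabilities. A minimal sketch of the difference, using NumPy and illustrative shapes (K augmentations, C classes; the array names and random logits are assumptions for demonstration, not the paper's setup):

```python
import numpy as np

def softmax(z, axis=-1):
    # Numerically stable softmax over the class axis
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

# Per-augmentation class logits for one underlying image:
# K = 8 augmentations, C = 3 classes (illustrative values)
rng = np.random.default_rng(0)
logits = rng.normal(size=(8, 3))

# Scheme 1: average the logits, then apply softmax once
p_logit_avg = softmax(logits.mean(axis=0))

# Scheme 2: apply softmax per augmentation, then average the
# predictive probabilities (the variant the abstract reports
# performing best empirically)
p_prob_avg = softmax(logits).mean(axis=0)
```

Both schemes yield a valid probability vector over classes, but they generally disagree: probability averaging is a mixture of the per-augmentation predictives, while logit averaging behaves more like a geometric mean and tends to produce sharper predictions.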


