A Simple Baseline for Bayesian Uncertainty in Deep Learning

02/07/2019
by Wesley Maddox, et al.

We propose SWA-Gaussian (SWAG), a simple, scalable, and general-purpose approach for uncertainty representation and calibration in deep learning. Stochastic Weight Averaging (SWA), which computes the first moment of stochastic gradient descent (SGD) iterates with a modified learning rate schedule, has recently been shown to improve generalization in deep learning. With SWAG, we fit a Gaussian using the SWA solution as the first moment and a low-rank plus diagonal covariance, also derived from the SGD iterates, forming an approximate posterior distribution over neural network weights; we then sample from this Gaussian distribution to perform Bayesian model averaging. We empirically find that SWAG approximates the shape of the true posterior, in accordance with results describing the stationary distribution of SGD iterates. Moreover, we demonstrate that SWAG performs well on a wide variety of computer vision tasks, including out-of-sample detection, calibration, and transfer learning, in comparison to many popular alternatives including MC dropout, KFAC Laplace, and temperature scaling.
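
Below is a minimal NumPy sketch, not the authors' reference implementation, of the moment tracking and sampling the abstract describes: a running SWA mean (first moment), a running second moment for the diagonal covariance, and the most recent K weight deviations for the low-rank covariance term. The class name SWAGSketch, the max_rank cap, and the scale argument are illustrative assumptions; the factor of 0.5 reflects the convention of halving the combined covariance.

```python
import numpy as np

class SWAGSketch:
    """Illustrative SWAG moment tracking and posterior sampling over a flat weight vector."""

    def __init__(self, num_params, max_rank=20):
        self.mean = np.zeros(num_params)      # running first moment (the SWA solution)
        self.sq_mean = np.zeros(num_params)   # running second moment (for the diagonal covariance)
        self.deviations = []                  # recent (weights - mean) vectors: columns of the low-rank factor
        self.max_rank = max_rank
        self.n = 0

    def collect(self, weights):
        """Update the running moments with one SGD iterate (e.g. once per epoch)."""
        self.n += 1
        self.mean += (weights - self.mean) / self.n
        self.sq_mean += (weights ** 2 - self.sq_mean) / self.n
        self.deviations.append(weights - self.mean)
        if len(self.deviations) > self.max_rank:  # keep only the last max_rank deviations
            self.deviations.pop(0)

    def sample(self, rng, scale=0.5):
        """Draw one weight vector from the fitted Gaussian (scale=0.5 halves the covariance)."""
        diag_var = np.clip(self.sq_mean - self.mean ** 2, 1e-30, None)
        draw = self.mean + np.sqrt(scale * diag_var) * rng.standard_normal(self.mean.shape)
        if self.deviations:
            dev = np.stack(self.deviations, axis=1)   # num_params x K deviation matrix
            k = dev.shape[1]
            draw += np.sqrt(scale / max(k - 1, 1)) * dev @ rng.standard_normal(k)
        return draw
```

At test time one would draw several such weight samples, evaluate the network with each, and average the resulting predictive distributions to form the Bayesian model average.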

Related research

04/13/2017
Stochastic Gradient Descent as Approximate Bayesian Inference
Stochastic Gradient Descent with a constant learning rate (constant SGD)...

03/14/2018
Averaging Weights Leads to Wider Optima and Better Generalization
Deep neural networks are typically trained by optimizing a loss function...

11/11/2018
SLANG: Fast Structured Covariance Approximations for Bayesian Deep Learning with Natural Gradient
Uncertainty estimation in large deep-learning models is a computationall...

01/03/2022
Stochastic Weight Averaging Revisited
Stochastic weight averaging (SWA) is recognized as a simple while one ef...

11/30/2018
Eigenvalue Corrected Noisy Natural Gradient
Variational Bayesian neural networks combine the flexibility of deep lea...

06/06/2023
Machine learning in and out of equilibrium
The algorithms used to train neural networks, like stochastic gradient d...

04/26/2019
SWALP: Stochastic Weight Averaging in Low-Precision Training
Low precision operations can provide scalability, memory savings, portab...
