Robustly representing inferential uncertainty in deep neural networks through sampling

11/05/2016
by Patrick McClure, et al.

As deep neural networks (DNNs) are applied to increasingly challenging problems, they will need to be able to represent their own uncertainty. Modeling uncertainty is one of the key features of Bayesian methods. Using Bernoulli dropout with sampling at prediction time has recently been proposed as an efficient and well-performing variational inference method for DNNs. However, sampling from other multiplicative-noise-based variational distributions has not been investigated in depth. We evaluated Bayesian DNNs trained with Bernoulli or Gaussian multiplicative masking of either the units (dropout) or the weights (dropconnect). We tested the calibration of the probabilistic predictions of Bayesian convolutional neural networks (CNNs) on MNIST and CIFAR-10. Sampling at prediction time improved the calibration of the DNNs' probabilistic predictions. Sampling weights, whether Gaussian or Bernoulli, led to a more robust representation of uncertainty compared to sampling units. However, using either Gaussian or Bernoulli dropout led to higher test set classification accuracy. Based on these findings, we used both Bernoulli dropout and Gaussian dropconnect concurrently, which we show approximates the use of a spike-and-slab variational distribution without increasing the number of learned parameters. We found that spike-and-slab sampling had higher test set performance than Gaussian dropconnect and more robustly represented its uncertainty compared to Bernoulli dropout.
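The sampling schemes the abstract compares can be sketched in plain NumPy. This is a minimal illustration under assumed conventions, not the authors' implementation: the Bernoulli mask uses standard inverted-dropout rescaling, the Gaussian mask matches the Bernoulli mask's mean and variance, and the "spike-and-slab" mode simply combines Bernoulli masking of units with Gaussian masking of weights, as the abstract describes. All network shapes and names here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def bernoulli_mask(shape, p=0.5):
    # Bernoulli "spike" mask: keep each element with probability 1 - p,
    # rescaled so the mask has mean 1 (standard inverted-dropout scaling).
    return rng.binomial(1, 1 - p, size=shape) / (1 - p)

def gaussian_mask(shape, p=0.5):
    # Gaussian "slab" mask with mean 1 and variance p / (1 - p),
    # chosen to match the Bernoulli mask's first two moments (an assumption).
    return rng.normal(1.0, np.sqrt(p / (1 - p)), size=shape)

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)   # subtract max for stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def forward(x, W1, W2, mode="dropout", mask=bernoulli_mask, p=0.5):
    # One stochastic forward pass through a toy two-layer network.
    h = np.maximum(0.0, x @ W1)             # hidden units, ReLU
    if mode == "dropout":                   # multiplicative noise on units
        logits = (h * mask(h.shape, p)) @ W2
    elif mode == "dropconnect":             # multiplicative noise on weights
        logits = h @ (W2 * mask(W2.shape, p))
    elif mode == "spike_and_slab":          # Bernoulli units + Gaussian weights
        h = h * bernoulli_mask(h.shape, p)
        logits = h @ (W2 * gaussian_mask(W2.shape, p))
    return softmax(logits)

def mc_predict(x, W1, W2, T=100, **kw):
    # Monte Carlo prediction: keep sampling masks at prediction time and
    # average the softmax over T stochastic forward passes, rather than
    # using a single deterministic pass with the mean mask.
    return np.mean([forward(x, W1, W2, **kw) for _ in range(T)], axis=0)
```

The key point the abstract makes is captured by `mc_predict`: the class probabilities are a Monte Carlo average over sampled masks, which is what yields the better-calibrated predictions, and changing `mode` switches between unit-level (dropout), weight-level (dropconnect), and combined spike-and-slab noise.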

