Qualitative Analysis of Monte Carlo Dropout

07/03/2020
by Ronald Seoh, et al.

In this report, we present a qualitative analysis of the Monte Carlo (MC) dropout method for measuring model uncertainty in neural network (NN) models. We first consider the sources of uncertainty in NNs and briefly review Bayesian neural networks (BNNs), a family of Bayesian approaches for tackling uncertainty in NNs. After presenting the mathematical formulation of MC dropout, we discuss the potential benefits and associated costs of using MC dropout in typical NN models, supported by results from our experiments.
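The core idea behind MC dropout, as referenced throughout this abstract, is to keep dropout active at inference time and average over several stochastic forward passes: the sample mean serves as the prediction and the sample spread as an uncertainty estimate. A minimal NumPy sketch of that procedure (the toy two-layer network, its weights, and all sizes here are our own illustration, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "trained" two-layer network: 4 inputs -> 16 hidden units -> 1 output.
W1 = rng.normal(size=(4, 16))
W2 = rng.normal(size=(16, 1))

def forward(x, p=0.5):
    """One stochastic forward pass: dropout stays ACTIVE at test time."""
    h = np.maximum(x @ W1, 0.0)        # ReLU hidden layer
    mask = rng.random(h.shape) > p     # Bernoulli dropout mask
    h = h * mask / (1.0 - p)           # inverted-dropout rescaling
    return h @ W2

def mc_dropout_predict(x, T=100, p=0.5):
    """Run T stochastic passes; return predictive mean and std."""
    samples = np.stack([forward(x, p) for _ in range(T)])
    return samples.mean(axis=0), samples.std(axis=0)

x = rng.normal(size=(1, 4))
mean, std = mc_dropout_predict(x)
```

The per-pass dropout masks make each forward pass a sample from an (approximate) posterior predictive distribution, so `std` grows for inputs on which the thinned subnetworks disagree.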


Related research

- 05/06/2022, Controlled Dropout for Uncertainty Estimation: Uncertainty quantification in a neural network is one of the most discus...
- 07/29/2018, Efficient Uncertainty Estimation for Semantic Segmentation in Videos: Uncertainty estimation in deep learning becomes more important recently....
- 08/06/2020, Notes on the Behavior of MC Dropout: Among the various options to estimate uncertainty in deep neural network...
- 06/09/2021, Ex uno plures: Splitting One Model into an Ensemble of Subnetworks: Monte Carlo (MC) dropout is a simple and efficient ensembling method tha...
- 04/20/2023, Efficient Uncertainty Estimation in Spiking Neural Networks via MC-dropout: Spiking neural networks (SNNs) have gained attention as models of sparse...
- 11/13/2021, MC-CIM: Compute-in-Memory with Monte-Carlo Dropouts for Bayesian Edge Intelligence: We propose MC-CIM, a compute-in-memory (CIM) framework for robust, yet l...
- 07/10/2020, Characteristics of Monte Carlo Dropout in Wide Neural Networks: Monte Carlo (MC) dropout is one of the state-of-the-art approaches for u...
