Uncertainty Propagation in Convolutional Neural Networks: Technical Report

02/11/2021
by Christos Tzelepis, et al.

In this technical report, we study the propagation of uncertainty, represented by the variances of given univariate normal random variables, through typical building blocks of a Convolutional Neural Network (CNN). These include layers that perform linear operations, such as 2D convolutional, fully-connected, and average-pooling layers, as well as layers that act non-linearly on their input, such as the Rectified Linear Unit (ReLU). Finally, we discuss the sigmoid function, for which we give approximations of its first- and second-order moments, and the binary cross-entropy loss function, for which we approximate its expected value under normally distributed inputs.
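To make the flavor of these propagation rules concrete, the sketch below implements the standard closed-form moments under the assumption of independent Gaussian activations: exact mean and variance propagation through a linear map (of which 2D convolution and average pooling are special cases), the exact first two moments of a ReLU applied to a normal input, and a common probit-style approximation of the sigmoid's mean. The function names are illustrative, not the report's own code, and the report's specific sigmoid and cross-entropy approximations may differ from the ones shown here.

```python
import numpy as np
from scipy.stats import norm

def linear_moments(W, b, mu, var):
    """Moments of y = W x + b for independent x_i ~ N(mu_i, var_i).
    Exact: E[y] = W mu + b and Var[y_j] = sum_i W_ji^2 var_i, since
    cross terms vanish under independence. 2D convolution and average
    pooling are linear maps, so the same rule applies with W taken as
    the corresponding (sparse) matrix."""
    return W @ mu + b, (W ** 2) @ var

def relu_moments(mu, var):
    """Exact mean and variance of max(0, X) for X ~ N(mu, var)."""
    s = np.sqrt(var)
    a = mu / s
    m1 = mu * norm.cdf(a) + s * norm.pdf(a)                   # E[max(0, X)]
    m2 = (mu**2 + var) * norm.cdf(a) + mu * s * norm.pdf(a)   # E[max(0, X)^2]
    return m1, m2 - m1**2

def sigmoid_mean(mu, var):
    """Approximate E[sigmoid(X)] for X ~ N(mu, var) using the probit
    approximation sigmoid(x) ~= Phi(sqrt(pi/8) * x), which yields
    E[sigmoid(X)] ~= sigmoid(mu / sqrt(1 + pi * var / 8))."""
    return 1.0 / (1.0 + np.exp(-mu / np.sqrt(1.0 + np.pi * var / 8.0)))

# Example: propagate a 3-dim Gaussian input through a tiny linear + ReLU block.
rng = np.random.default_rng(0)
W, b = rng.normal(size=(2, 3)), np.zeros(2)
mu, var = np.array([0.5, -1.0, 2.0]), np.array([0.1, 0.2, 0.3])
mu, var = linear_moments(W, b, mu, var)
mu, var = relu_moments(mu, var)
print(mu, var, sigmoid_mean(mu, var))
```

Treating convolution and pooling as explicit matrices above is purely for exposition; in practice, the variance map can be propagated by running the element-wise squared kernel over it with the same stride and padding as the forward convolution.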
