A Gaussian Process perspective on Convolutional Neural Networks

10/25/2018
by Anastasia Borovykh, et al.

In this paper we study the well-known convolutional neural network from a Gaussian process perspective. In this way we hope to gain additional insight into the performance of convolutional networks, in particular to understand under what circumstances they tend to perform well and what assumptions are implicitly made in the network. While for feedforward networks the convergence to Gaussian processes has been studied extensively, little is known about the situations in which the output of a convolutional network approaches a multivariate normal distribution. In the convolutional network the sum is computed over variables which are not necessarily identically distributed, so the classical central limit theorem does not apply. Nevertheless, we can apply a Lyapunov-type bound on the distance between the Gaussian process and the convolutional network output, and use this bound to study the conditions under which the convolutional network behaves approximately like a Gaussian process, so that this behavior, depending on the application, can be either obtained or avoided.
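The wide-channel limit described above can be illustrated numerically: with random weights, the scalar output of a one-hidden-layer 1-D convolutional network looks increasingly Gaussian as the number of channels grows. A minimal sketch with NumPy, assuming an illustrative architecture (filter size 3, ReLU nonlinearity, standard 1/sqrt(width) weight scaling), not the paper's exact setup:

```python
import numpy as np

rng = np.random.default_rng(0)

def conv_output(x, n_channels, filter_size=3):
    """Scalar output of a random one-hidden-layer 1-D conv net.

    n_channels random filters + ReLU, then a random linear readout
    of the centre position; weights are drawn fresh on every call,
    so repeated calls sample from the network's prior over outputs.
    """
    W = rng.normal(0.0, 1.0 / np.sqrt(filter_size), (n_channels, filter_size))
    # valid 1-D convolution of each filter with the input
    h = np.array([np.convolve(x, w, mode="valid") for w in W])
    h = np.maximum(h, 0.0)  # ReLU
    v = rng.normal(0.0, 1.0 / np.sqrt(n_channels), n_channels)
    return v @ h[:, h.shape[1] // 2]

x = rng.normal(size=32)  # one fixed input
for n in (2, 2000):
    samples = np.array([conv_output(x, n) for _ in range(500)])
    # excess kurtosis is 0 for a Gaussian; it shrinks as channels grow
    k = np.mean((samples - samples.mean()) ** 4) / samples.var() ** 2 - 3.0
    print(f"channels={n:5d}  excess kurtosis={k:+.2f}")
```

For a narrow network the summands are few and non-identically distributed, so the output distribution retains heavy tails; widening the sum drives the excess kurtosis toward zero, in line with the Lyapunov-type bound discussed in the abstract.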


