Deep Neural Networks as Gaussian Processes

by Jaehoon Lee et al.

A deep fully-connected neural network with an i.i.d. prior over its parameters is equivalent to a Gaussian process (GP) in the limit of infinite network width. This correspondence enables exact Bayesian inference for infinitely wide neural networks on regression tasks by means of straightforward matrix computations. For single hidden-layer networks, the covariance function of this GP has long been known. Recently, kernel functions for multi-layer random neural networks have been developed, but only outside of a Bayesian framework. As such, previous work has not identified the correspondence between using these kernels as the covariance function for a GP and performing fully Bayesian prediction with a deep neural network. In this work, we derive this correspondence and develop a computationally efficient pipeline to compute the covariance functions. We then use the resulting GPs to perform Bayesian inference for deep neural networks on MNIST and CIFAR-10. We find that the GP-based predictions are competitive and can outperform neural networks trained with stochastic gradient descent. We observe that trained neural network accuracy approaches that of the corresponding GP with increasing layer width, and that the GP uncertainty is strongly correlated with prediction error. Finally, we connect these observations to the recent theory of signal propagation in random neural networks.
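To make the abstract's "straightforward matrix computations" concrete, below is a minimal NumPy sketch of the two ingredients it describes: the layer-wise covariance recursion for a ReLU network (using the closed-form arc-cosine expectation of Cho & Saul, 2009) and exact GP posterior prediction via a Cholesky factorization. This is an illustrative reconstruction, not the authors' released pipeline; the function names (nngp_kernel, gp_posterior) and the hyperparameter values (sigma_w2, sigma_b2, depth, noise) are assumptions chosen for the example.

import numpy as np

def nngp_kernel(X1, X2, depth=3, sigma_w2=2.0, sigma_b2=0.1):
    """ReLU NNGP covariance between the rows of X1 and X2.

    Base case: K^0(x, x') = sigma_b^2 + sigma_w^2 * <x, x'> / d_in.
    Recursion: K^l(x, x') = sigma_b^2 + sigma_w^2 * E[relu(u) relu(v)],
    with (u, v) ~ N(0, K^{l-1}); for ReLU this expectation has the
    closed form sqrt(k_xx * k_x'x') * (sin t + (pi - t) cos t) / (2 pi),
    where cos t = K^{l-1}(x, x') / sqrt(k_xx * k_x'x').
    """
    d_in = X1.shape[1]
    K = sigma_b2 + sigma_w2 * (X1 @ X2.T) / d_in             # cross-covariances
    k1 = sigma_b2 + sigma_w2 * np.sum(X1**2, axis=1) / d_in  # self-covariances
    k2 = sigma_b2 + sigma_w2 * np.sum(X2**2, axis=1) / d_in
    for _ in range(depth):
        norms = np.sqrt(np.outer(k1, k2))
        theta = np.arccos(np.clip(K / norms, -1.0, 1.0))
        K = sigma_b2 + sigma_w2 * norms * (
            np.sin(theta) + (np.pi - theta) * np.cos(theta)) / (2 * np.pi)
        k1 = sigma_b2 + sigma_w2 * k1 / 2.0  # E[relu(u)^2] = Var(u) / 2
        k2 = sigma_b2 + sigma_w2 * k2 / 2.0
    return K, k1, k2

def gp_posterior(X_train, y_train, X_test, noise=1e-4, **kernel_kw):
    """Exact GP regression with the NNGP kernel: posterior mean and variance."""
    Kxx, _, _ = nngp_kernel(X_train, X_train, **kernel_kw)
    Ksx, _, kss = nngp_kernel(X_train, X_test, **kernel_kw)  # kss: test prior variances
    L = np.linalg.cholesky(Kxx + noise * np.eye(len(X_train)))
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    v = np.linalg.solve(L, Ksx)
    return Ksx.T @ alpha, kss - np.sum(v**2, axis=0)  # mean, pointwise variance

# Toy usage: 1-D regression; the paper's full setting instead treats
# classification as regression on one-hot targets.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(40, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(40)
mu, var = gp_posterior(X, y, np.linspace(-3, 3, 100)[:, None], depth=3)

Note how inference reduces entirely to dense linear algebra: one Cholesky factorization of the train-train kernel plus triangular solves, which is the sense in which Bayesian prediction here comes down to straightforward matrix computations.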


