Wide stochastic networks: Gaussian limit and PAC-Bayesian training

06/17/2021
by Eugenio Clerico, et al.

The limit of infinite width allows for substantial simplifications in the analytical study of overparameterized neural networks. With a suitable random initialization, an extremely large network is well approximated by a Gaussian process, both before and during training. In the present work, we establish a similar result for a simple stochastic architecture whose parameters are random variables. The explicit evaluation of the output distribution allows for a PAC-Bayesian training procedure that directly optimizes the generalization bound. For a large but finite-width network, we show empirically on MNIST that this training approach can outperform standard PAC-Bayesian methods.
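For context on the objective being optimized: the abstract does not state which bound the authors use, but a classical PAC-Bayesian bound of this kind is McAllester's (in Maurer's refined form). For a prior $P$ fixed before seeing the data, any posterior $Q$ over the network parameters, and $m$ i.i.d. training examples, with probability at least $1 - \delta$ over the sample, simultaneously for all $Q$,

$$
\mathbb{E}_{\theta \sim Q}\!\left[L(\theta)\right]
\;\le\;
\mathbb{E}_{\theta \sim Q}\!\left[\hat{L}(\theta)\right]
+ \sqrt{\frac{\mathrm{KL}(Q \,\|\, P) + \ln\!\frac{2\sqrt{m}}{\delta}}{2m}},
$$

where $L$ and $\hat{L}$ denote the population and empirical risks. A procedure that "directly optimizes the generalization bound" minimizes the right-hand side over $Q$; the role of the Gaussian limit is that, for a wide stochastic network, the output distribution (and hence the terms of such an objective) admits an explicit form.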


Related research

04/06/2023 · Wide neural networks: From non-Gaussian random fields at initialization to the NTK geometry of training
Recent developments in applications of artificial neural networks with o...

09/06/2022 · A PAC-Bayes bound for deterministic classifiers
We establish a disintegrated PAC-Bayesian bound, for classifiers that ar...

10/07/2014 · PAC-Bayesian AUC classification and scoring
We develop a scoring and classification procedure based on the PAC-Bayes...

06/22/2020 · Differentiable PAC-Bayes Objectives with Partially Aggregated Neural Networks
We make three related contributions motivated by the challenge of traini...

09/25/2019 · Asymptotics of Wide Networks from Feynman Diagrams
Understanding the asymptotic behavior of wide networks is of considerabl...

10/22/2021 · Conditional Gaussian PAC-Bayes
Recent studies have empirically investigated different methods to train ...

03/14/2022 · Quantitative Gaussian Approximation of Randomly Initialized Deep Neural Networks
Given any deep fully connected neural network, initialized with random G...
