A General Framework for the Derandomization of PAC-Bayesian Bounds

02/17/2021
by Paul Viallard, et al.

PAC-Bayesian bounds are known to be tight and informative when studying the generalization ability of randomized classifiers. However, when applied to some families of deterministic models such as neural networks, they require a loose and costly derandomization step. As an alternative to this step, we introduce three new PAC-Bayesian generalization bounds whose originality is that they are pointwise: they provide guarantees for a single drawn hypothesis instead of the usual averaged analysis. Our bounds are rather general and potentially parameterizable, and they provide novel insights for various machine learning settings that rely on randomized algorithms. We illustrate the interest of our theoretical results for the analysis of neural network training.
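For background on the "averaged analysis" the abstract contrasts with, a minimal sketch of the classical McAllester-style PAC-Bayes bound may help. This is the standard bound from the literature, not one of the three pointwise bounds introduced in the paper: with probability at least 1 − δ over the sample, the expected risk of a hypothesis drawn from the posterior ρ is controlled by the empirical risk plus a complexity term involving KL(ρ‖π). A pointwise bound instead holds for one single draw h ∼ ρ, with probability taken jointly over the sample and the draw.

```python
import math

def mcallester_bound(emp_risk, kl, m, delta):
    """Classical averaged PAC-Bayes bound (McAllester):
    with probability >= 1 - delta over an i.i.d. sample of size m,
      E_{h~rho} R(h) <= E_{h~rho} r(h)
                        + sqrt((KL(rho||pi) + ln(2*sqrt(m)/delta)) / (2*m)).
    Note the left-hand side is an *average* over the posterior rho;
    guaranteeing a single drawn hypothesis requires a derandomization
    step (or a pointwise bound, as in the paper above).
    """
    complexity = (kl + math.log(2.0 * math.sqrt(m) / delta)) / (2.0 * m)
    return emp_risk + math.sqrt(complexity)

# Illustrative values (hypothetical): empirical risk 0.1, KL divergence 5 nats,
# 10,000 training examples, confidence 95%.
print(round(mcallester_bound(0.1, 5.0, 10_000, 0.05), 4))
```

The bound tightens as the sample size m grows and loosens as the posterior ρ moves away from the prior π (larger KL), which is the usual trade-off these guarantees express.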


