PAC-Bayesian Theory Meets Bayesian Inference

by Pascal Germain, et al.

We exhibit a strong link between frequentist PAC-Bayesian risk bounds and the Bayesian marginal likelihood. That is, for the negative log-likelihood loss function, we show that minimizing PAC-Bayesian generalization risk bounds maximizes the Bayesian marginal likelihood. This provides an alternative explanation of the Bayesian Occam's razor criterion, under the assumption that the data are generated by an i.i.d. distribution. Moreover, as the negative log-likelihood is an unbounded loss function, we motivate and propose a PAC-Bayesian theorem tailored for the sub-gamma loss family, and we show that our approach is sound on classical Bayesian linear regression tasks.
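The link the abstract describes can be illustrated numerically on Bayesian linear regression, the setting the paper itself uses. The sketch below is not the authors' code: it checks the standard identity that, for the negative log-likelihood loss, the PAC-Bayes-style objective (expected empirical NLL under the posterior plus the KL divergence to the prior) evaluated at the exact Bayesian posterior equals the negative log marginal likelihood. All hyperparameter values (`sigma2`, `tau2`, problem sizes) are hypothetical choices for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 50, 3
sigma2, tau2 = 0.5, 2.0          # noise and prior variances (hypothetical values)
X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y = X @ w_true + rng.normal(scale=np.sqrt(sigma2), size=n)

# Exact Bayesian posterior N(mu_N, Sigma_N) for the prior w ~ N(0, tau2 * I)
Sigma_N = np.linalg.inv(X.T @ X / sigma2 + np.eye(d) / tau2)
mu_N = Sigma_N @ X.T @ y / sigma2

# Log marginal likelihood: marginally, y ~ N(0, sigma2 * I + tau2 * X X^T)
C = sigma2 * np.eye(n) + tau2 * X @ X.T
_, logdet_C = np.linalg.slogdet(C)
log_ml = -0.5 * (n * np.log(2 * np.pi) + logdet_C + y @ np.linalg.solve(C, y))

# Gibbs term: expected negative log-likelihood of the data under the posterior
resid = y - X @ mu_N
exp_sq = resid @ resid + np.trace(X @ Sigma_N @ X.T)
gibbs_nll = 0.5 * (n * np.log(2 * np.pi * sigma2) + exp_sq / sigma2)

# KL(posterior || prior) between the two Gaussians
_, logdet_S = np.linalg.slogdet(Sigma_N)
kl = 0.5 * (np.trace(Sigma_N) / tau2 + mu_N @ mu_N / tau2 - d
            + d * np.log(tau2) - logdet_S)

# PAC-Bayes objective at the Bayesian posterior = -log marginal likelihood
print(abs(gibbs_nll + kl + log_ml))  # ~0, up to floating-point error
```

Because the identity is exact at the true posterior, minimizing the bound's trade-off between empirical Gibbs risk and KL complexity recovers the posterior that maximizes the marginal likelihood, which is the connection the abstract states.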





Code Repositories


Code related to my NIPS 2016 paper

