
Marginal and Joint Cross-Entropies & Predictives for Online Bayesian Inference, Active Learning, and Active Sampling

by Andreas Kirsch, et al. (University of Oxford)

Principled Bayesian deep learning (BDL) does not live up to its potential when we focus only on marginal predictive distributions (marginal predictives). Recent works have highlighted the importance of joint predictives for (Bayesian) sequential decision making from a theoretical and synthetic perspective. We provide additional practical arguments, grounded in real-world applications, for focusing on joint predictives: we discuss online Bayesian inference, which would allow us to incorporate additional data into predictions without retraining, and we propose new, challenging evaluation settings using active learning and active sampling. These settings are motivated by an examination of marginal and joint predictives, their respective cross-entropies, and their place in offline and online learning. They are more realistic than previously suggested ones, building on work by Wen et al. (2021) and Osband et al. (2022), and focus on evaluating the performance of approximate BNNs in an online supervised setting. Initial experiments, however, raise questions about the feasibility of these ideas in high-dimensional parameter spaces with current BDL inference techniques, and we suggest experiments that might help shed further light on the practicality of current research for these problems. Importantly, our work highlights previously unidentified gaps in current research and the need for better approximate joint predictives.
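To make the distinction concrete, here is a minimal sketch (not from the paper; all shapes, the toy ensemble, and the labels are hypothetical) of how a marginal and a joint predictive differ when both are estimated from the same Monte Carlo parameter samples: the marginal predictive averages over parameters separately for each test point, whereas the joint predictive averages the product of likelihoods across test points, so it retains correlations induced by shared parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy approximate posterior: K parameter samples, each yielding class
# probabilities for N test points over C classes (stand-in for a BNN ensemble).
K, N, C = 100, 5, 3
logits = rng.normal(size=(K, N, C))
probs = np.exp(logits) / np.exp(logits).sum(-1, keepdims=True)  # p(y_i | x_i, theta_k)

labels = rng.integers(C, size=N)  # hypothetical test labels

# Marginal predictive: average over parameter samples per point, then
# sum the per-point cross-entropies (treats points as independent).
marginal = probs.mean(0)  # shape (N, C)
marginal_ce = -np.log(marginal[np.arange(N), labels]).sum()

# Joint predictive: for each parameter sample, take the product of
# likelihoods across all points, then average over samples.
per_sample_lik = probs[:, np.arange(N), labels]  # shape (K, N)
joint = per_sample_lik.prod(axis=1).mean(axis=0)  # scalar p(y_1..N | x_1..N)
joint_ce = -np.log(joint)

print(f"sum of marginal cross-entropies: {marginal_ce:.4f}")
print(f"joint cross-entropy:             {joint_ce:.4f}")
```

The two quantities coincide only when the posterior collapses to a single parameter sample (K = 1); otherwise the joint cross-entropy reflects between-point correlations that the product of marginals discards, which is the gap the abstract argues current BDL evaluation overlooks.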
