Implicit Variational Inference: the Parameter and the Predictor Space

by Yann Pequignot, et al.

Having access to accurate confidence levels along with predictions makes it possible to determine whether making a decision is worth the risk. Under the Bayesian paradigm, the posterior distribution over parameters is used to capture model uncertainty, valuable information that can be translated into predictive uncertainty. However, computing the posterior distribution for high-capacity predictors such as neural networks is generally intractable, which makes approximate methods such as variational inference a promising alternative. While most methods perform inference in the space of parameters, we explore the benefits of carrying out inference directly in the space of predictors. Relying on a family of distributions given by a deep generative neural network, we present two ways of performing variational inference: one in parameter space and one in predictor space. Importantly, the latter requires us to choose a distribution of inputs, and therefore allows us to explicitly address the question of out-of-distribution uncertainty at the same time. We explore from various perspectives the implications of working in the predictor space induced by neural networks as opposed to the parameter space, focusing mainly on the quality of uncertainty estimation for data lying outside the training distribution. We compare the posterior approximations obtained with these two methods to several standard methods and present results showing that variational approximations learned in the predictor space compare favorably with those trained in the parameter space.
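The abstract describes a variational family defined implicitly by a deep generative network that maps noise to the parameters of a predictor; evaluating a generated predictor on inputs drawn from a chosen input distribution gives the predictor-space view. The following is a minimal sketch of that idea in PyTorch, assuming a hypernetwork-style generator and a standard-normal input distribution; all names, sizes, and the input distribution are illustrative assumptions and not taken from the paper's code.

```python
import torch
import torch.nn as nn

# Illustrative sizes (assumptions, not from the paper).
IN_DIM, HIDDEN, OUT_DIM, NOISE_DIM = 1, 16, 1, 8

# Total number of parameters of the small MLP predictor we generate.
N_PARAMS = (IN_DIM * HIDDEN + HIDDEN) + (HIDDEN * OUT_DIM + OUT_DIM)

class Generator(nn.Module):
    """Maps latent noise z to a full parameter vector of the predictor."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(NOISE_DIM, 64), nn.ReLU(),
            nn.Linear(64, N_PARAMS),
        )

    def forward(self, z):
        return self.net(z)

def predictor(theta, x):
    """Apply the generated parameter vector theta as a one-hidden-layer MLP to x."""
    i = 0
    w1 = theta[i:i + IN_DIM * HIDDEN].view(HIDDEN, IN_DIM); i += IN_DIM * HIDDEN
    b1 = theta[i:i + HIDDEN]; i += HIDDEN
    w2 = theta[i:i + HIDDEN * OUT_DIM].view(OUT_DIM, HIDDEN); i += HIDDEN * OUT_DIM
    b2 = theta[i:i + OUT_DIM]
    h = torch.relu(x @ w1.T + b1)
    return h @ w2.T + b2

# One sample from the implicit distribution over predictors: draw noise,
# generate parameters, then evaluate on inputs drawn from a chosen input
# distribution (here standard normal, an assumption) for the predictor-space view.
gen = Generator()
z = torch.randn(NOISE_DIM)
theta = gen(z)
x = torch.randn(32, IN_DIM)   # inputs at which the predictor sample is observed
y = predictor(theta, x)       # function values of one sampled predictor
print(y.shape)                # torch.Size([32, 1])
```

Training such a generator would optimize a variational objective over either the generated parameters or the induced function values at sampled inputs; that objective is omitted here, since the sketch is only meant to illustrate the parameter-space versus predictor-space distinction.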




