Loss-Calibrated Approximate Inference in Bayesian Neural Networks

05/10/2018
by Adam D. Cobb, et al.

Current approaches in approximate inference for Bayesian neural networks minimise the Kullback-Leibler divergence to approximate the true posterior over the weights. However, this approximation is without knowledge of the final application, and therefore cannot guarantee optimal predictions for a given task. To make more suitable task-specific approximations, we introduce a new loss-calibrated evidence lower bound for Bayesian neural networks in the context of supervised learning, informed by Bayesian decision theory. By introducing a lower bound that depends on a utility function, we ensure that our approximation achieves higher utility than traditional methods for applications that have asymmetric utility functions. Furthermore, in using dropout inference, we highlight that our new objective is identical to that of standard dropout neural networks, with an additional utility-dependent penalty term. We demonstrate our new loss-calibrated model with an illustrative medical example and a restricted model capacity experiment, and highlight failure modes of the comparable weighted cross entropy approach. Lastly, we demonstrate the scalability of our method to real-world applications with per-pixel semantic segmentation on an autonomous driving data set.
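The abstract notes that, under dropout inference, the new objective is the standard dropout neural network objective plus a utility-dependent penalty term. The short PyTorch sketch below illustrates that structure only; it is not the authors' implementation. The utility matrix, the exact form of the penalty, and all hyperparameters (n_mc, weight_decay) are illustrative assumptions, and the asymmetric utility echoes the medical example, where missing the positive class is costlier than a false alarm.

    import torch
    import torch.nn.functional as F

    # Hypothetical asymmetric utility matrix for a two-class example:
    # rows index decisions, columns index true classes. Deciding 0 when
    # the truth is class 1 carries the lowest utility.
    UTILITY = torch.tensor([[1.0, 0.1],
                            [0.6, 1.0]])


    def loss_calibrated_objective(model, x, y, n_mc=10, weight_decay=1e-4):
        # Standard dropout-network term: cross-entropy on one stochastic
        # forward pass (assumes `model` has dropout layers and is in train mode).
        nll = F.cross_entropy(model(x), y)

        # Utility-dependent penalty (schematic): estimate the predictive
        # distribution with Monte Carlo dropout, pick the utility-optimal
        # decision per input, and penalise low expected utility under it.
        probs = torch.stack([F.softmax(model(x), dim=-1) for _ in range(n_mc)]).mean(0)
        expected_utility = probs @ UTILITY.T               # (batch, n_decisions)
        h = expected_utility.argmax(dim=-1, keepdim=True)  # utility-optimal decisions
        penalty = -torch.log(expected_utility.gather(1, h) + 1e-8).mean()

        # L2 term standing in for the weight-decay regulariser of MC dropout.
        l2 = sum((p ** 2).sum() for p in model.parameters())

        return nll + penalty + weight_decay * l2

In practice the utility matrix is where the task-specific, asymmetric costs enter; with a symmetric matrix the penalty adds little beyond the standard cross-entropy term.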

Related research

- Loss-calibrated expectation propagation for approximate Bayesian decision-making (01/10/2022): Approximate Bayesian inference methods provide a powerful suite of tools...
- On Calibrated Model Uncertainty in Deep Learning (06/15/2022): Estimated uncertainty by approximate posteriors in Bayesian neural netwo...
- Post-hoc loss-calibration for Bayesian neural networks (06/13/2021): Bayesian decision theory provides an elegant framework for acting optima...
- Variational Bayesian dropout: pitfalls and fixes (07/05/2018): Dropout, a stochastic regularisation technique for training of neural ne...
- When Monte-Carlo Dropout Meets Multi-Exit: Optimizing Bayesian Neural Networks on FPGA (08/13/2023): Bayesian Neural Networks (BayesNNs) have demonstrated their capability o...
- On the Performance of Direct Loss Minimization for Bayesian Neural Networks (11/15/2022): Direct Loss Minimization (DLM) has been proposed as a pseudo-Bayesian me...
- Bayesian Neural Networks With Maximum Mean Discrepancy Regularization (03/02/2020): Bayesian Neural Networks (BNNs) are trained to optimize an entire distri...
