Stochastic Neural Network with Kronecker Flow

by Chin-Wei Huang et al.
Element AI Inc
Université de Montréal

Recent advances in variational inference enable the modelling of highly structured joint distributions, but their capacity to scale to the high-dimensional setting of stochastic neural networks remains limited. This limitation motivates a need for scalable parameterizations of the noise generation process that adequately capture the dependencies among the various parameters. In this work, we address this need and present the Kronecker Flow, a generalization of the Kronecker product to invertible mappings designed for stochastic neural networks. We apply our method to variational Bayesian neural networks on predictive tasks, PAC-Bayes generalization bound estimation, and approximate Thompson sampling in contextual bandits. In all setups, our method proves competitive with existing approaches and outperforms the baselines.
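The core scalability idea behind a Kronecker-structured invertible map can be illustrated with a linear flow. Below is a minimal NumPy sketch (not the paper's implementation; the factor matrices `A` and `B` and the function name are illustrative): applying `W = A @ Z @ B.T` to a matrix-shaped noise sample `Z` is equivalent to multiplying `vec(Z)` by the Kronecker product `B ⊗ A`, but avoids ever forming the large `mn × mn` matrix, and the log-determinant of the Jacobian factorizes accordingly.

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 3, 4  # shape of the matrix-valued noise (small, hypothetical example)

# Invertible factors; in a flow these would be learned parameters.
# Initializing near the identity keeps them well-conditioned.
A = np.eye(m) + 0.1 * rng.standard_normal((m, m))
B = np.eye(n) + 0.1 * rng.standard_normal((n, n))

def kronecker_linear_flow(Z, A, B):
    """Apply the Kronecker-structured linear map W = A @ Z @ B.T.

    Equivalent to (B kron A) @ vec(Z) (column-major vec), but costs
    O(m^2 n + m n^2) instead of O(m^2 n^2). Returns the transformed
    sample and the log-abs-determinant of the Jacobian, which
    factorizes as n*log|det A| + m*log|det B|.
    """
    W = A @ Z @ B.T
    _, logdet_A = np.linalg.slogdet(A)
    _, logdet_B = np.linalg.slogdet(B)
    return W, n * logdet_A + m * logdet_B

Z = rng.standard_normal((m, n))  # base noise, e.g. standard Gaussian
W, logdet = kronecker_linear_flow(Z, A, B)

# Sanity check against the explicit (dense) Kronecker product.
big = np.kron(B, A)  # mn x mn -- exactly what the factored form avoids
assert np.allclose(big @ Z.flatten(order="F"), W.flatten(order="F"))
assert np.isclose(logdet, np.linalg.slogdet(big)[1])
```

The cheap log-determinant is what makes such a map usable inside variational inference: the density of the transformed noise stays tractable even when `mn` is the full parameter count of a network layer.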


