Predictive Uncertainty through Quantization

10/12/2018
by Bastiaan S. Veeling, et al.

High-risk domains require reliable confidence estimates from predictive models. Deep latent variable models provide these, but suffer from the rigid variational distributions used for tractable inference, which err on the side of overconfidence. We propose Stochastic Quantized Activation Distributions (SQUAD), which imposes a flexible yet tractable distribution over discretized latent variables. The proposed method is scalable, self-normalizing and sample efficient. We demonstrate that the model fully utilizes the flexible distribution, learns interesting non-linearities, and provides predictive uncertainty of competitive quality.
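The abstract does not pin down the exact parameterization, but the core idea, placing a per-unit categorical distribution over a fixed grid of quantization levels so that the posterior is flexible yet normalizes by construction, can be sketched in a few lines. Everything below (the squared-distance logits, the Gumbel-softmax relaxation, the [-1, 1] level grid, and the class name StochasticQuantizedActivation) is an illustrative assumption, not the paper's exact construction:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class StochasticQuantizedActivation(nn.Module):
    """Sketch of a stochastic quantized activation layer.

    Each pre-activation induces a categorical distribution over K fixed
    quantization levels; the softmax over levels makes the distribution
    self-normalizing, and sampling via Gumbel-softmax keeps the layer
    differentiable. Hypothetical design, not the paper's exact method.
    """

    def __init__(self, num_units, num_levels=8, tau=1.0):
        super().__init__()
        # Fixed grid of quantization levels, here evenly spaced in [-1, 1]
        # (an assumption; any fixed grid would do).
        self.register_buffer("levels", torch.linspace(-1.0, 1.0, num_levels))
        # Learnable per-unit sharpness for the logits.
        self.scale = nn.Parameter(torch.ones(num_units))
        self.tau = tau

    def forward(self, pre_act):
        # pre_act: (batch, num_units). Logits are the negative squared
        # distance of each pre-activation to each quantization level.
        diffs = pre_act.unsqueeze(-1) - self.levels          # (B, D, K)
        logits = -(diffs ** 2) * self.scale.unsqueeze(-1)    # (B, D, K)
        # Draw a (nearly) one-hot sample over the K levels per unit,
        # differentiable via the straight-through Gumbel-softmax.
        sample = F.gumbel_softmax(logits, tau=self.tau, hard=True)
        # Decode the sample back to a scalar activation per unit.
        return (sample * self.levels).sum(dim=-1)            # (B, D)
```

At test time, averaging a network's predictions over several forward passes through such a layer gives a Monte Carlo estimate of the predictive distribution, which is where uncertainty estimates of this kind come from.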

Related research

Doubly Stochastic Variational Inference for Neural Processes with Hierarchical Latent Variables (08/21/2020)
Neural processes (NPs) constitute a family of variational approximate mo...

Neural Variational Inference For Estimating Uncertainty in Knowledge Graph Embeddings (06/12/2019)
Recent advances in Neural Variational Inference allowed for a renaissanc...

Conjugate Energy-Based Models (06/25/2021)
In this paper, we propose conjugate energy-based models (CEBMs), a new c...

Sparsifying Bayesian neural networks with latent binary variables and normalizing flows (05/05/2023)
Artificial neural networks (ANNs) are powerful machine learning methods ...

Quantification of Predictive Uncertainty via Inference-Time Sampling (08/03/2023)
Predictive variability due to data ambiguities has typically been addres...

A Flexible Stochastic Conditional Duration Model (05/19/2020)
We introduce a new stochastic duration model for transaction times in as...

Discretely Relaxing Continuous Variables for tractable Variational Inference (09/12/2018)
We explore a new research direction in Bayesian variational inference wi...