A general framework for ensemble distribution distillation

02/26/2020
by Jakob Lindqvist, et al.

Ensembles of neural networks have been shown to give better performance than single networks, both in terms of predictions and uncertainty estimation. Additionally, ensembles allow the uncertainty to be decomposed into aleatoric (data) and epistemic (model) components, giving a more complete picture of the predictive uncertainty. Ensemble distillation is the process of compressing an ensemble into a single model, often resulting in a leaner model that still outperforms the individual ensemble members. Unfortunately, standard distillation erases the natural uncertainty decomposition of the ensemble. We present a general framework for distilling both regression and classification ensembles in a way that preserves the decomposition. We demonstrate the desired behaviour of our framework and show that its predictive performance is on par with standard distillation.
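
For classification ensembles, the decomposition the abstract refers to is commonly computed as: total predictive entropy = expected member entropy (aleatoric) + mutual information between the prediction and the ensemble member (epistemic). Below is a minimal NumPy sketch of that standard split, not the paper's own framework; the function names are illustrative.

```python
import numpy as np

def entropy(p, axis=-1, eps=1e-12):
    """Shannon entropy (in nats) along the given axis."""
    return -np.sum(p * np.log(p + eps), axis=axis)

def decompose_uncertainty(member_probs):
    """Split ensemble uncertainty for a single input.

    member_probs: array of shape (M, K) holding the softmax outputs
    of M ensemble members over K classes.

    Returns (total, aleatoric, epistemic), where
      total     = H[mean_m p_m(y|x)]   (predictive entropy)
      aleatoric = mean_m H[p_m(y|x)]   (expected data uncertainty)
      epistemic = total - aleatoric    (mutual information; >= 0 by Jensen)
    """
    mean_probs = member_probs.mean(axis=0)
    total = entropy(mean_probs)
    aleatoric = entropy(member_probs).mean()
    epistemic = total - aleatoric
    return total, aleatoric, epistemic

# Members that agree: epistemic uncertainty is near zero.
print(decompose_uncertainty(np.array([[0.70, 0.20, 0.10],
                                      [0.72, 0.18, 0.10]])))
# Members that disagree: epistemic uncertainty dominates.
print(decompose_uncertainty(np.array([[0.90, 0.05, 0.05],
                                      [0.05, 0.90, 0.05]])))
```

Standard distillation trains the student to match only the mean prediction mean_m p_m(y|x), so the epistemic term above is no longer recoverable from the student; that lost decomposition is what the paper's framework preserves.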


Related research

Hydra: Preserving Ensemble Diversity for Model Distillation (01/14/2020)
Ensembles of models have been empirically shown to improve predictive pe...

Self-Distribution Distillation: Efficient Uncertainty Estimation (03/15/2022)
Deep learning is increasingly being applied in safety-critical domains. ...

Fast, Accurate, and Simple Models for Tabular Data via Augmented Distillation (06/25/2020)
Automated machine learning (AutoML) can produce complex model ensembles ...

DUDES: Deep Uncertainty Distillation using Ensembles for Semantic Segmentation (03/17/2023)
Deep neural networks lack interpretability and tend to be overconfident,...

You can have your ensemble and run it too – Deep Ensembles Spread Over Time (09/20/2023)
Ensembles of independently trained deep neural networks yield uncertaint...

Logit-Based Ensemble Distribution Distillation for Robust Autoregressive Sequence Uncertainties (05/17/2023)
Efficiently and reliably estimating uncertainty is an important objectiv...

Anti-Distillation: Improving reproducibility of deep networks (10/19/2020)
Deep networks have been revolutionary in improving performance of machin...
