
Generative Parameter Sampler For Scalable Uncertainty Quantification

by Minsuk Shin et al. (Harvard University)

Uncertainty quantification has long been a core part of statistical machine learning, but its computational cost remains a serious challenge for both Bayesians and frequentists. We propose a model-based framework for quantifying uncertainty, called the predictive-matching Generative Parameter Sampler (GPS). This procedure considers an Uncertainty Quantification (UQ) distribution on the targeted parameter, defined as the minimizer of a distance between the empirical distribution and the resulting predictive distribution. The framework adopts a hierarchical modeling perspective in which each observation is modeled by its own individual parameter. This individual parameterization makes the resulting inference computationally scalable and robust to outliers. Our approach is illustrated on linear models, Poisson processes, and deep neural networks for classification. The results show that the GPS succeeds in providing uncertainty quantification, as well as additional flexibility beyond what classical statistical procedures allow under the postulated statistical models.
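To make the predictive-matching idea concrete, here is a minimal toy sketch of the GPS principle, not the paper's actual algorithm: a simple affine generator produces parameter samples from noise, and its weights are chosen to minimize the energy distance between observed data and the implied predictive samples. The Gaussian mean model, the affine generator form, the use of energy distance, and all variable names are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
y = rng.normal(2.0, 1.0, size=200)   # observed data (toy Gaussian mean model)
z = rng.normal(size=200)             # generator noise, fixed (common random numbers)
eps = rng.normal(size=200)           # observation noise for predictive draws

def energy_distance(x, y):
    # 2 E|X-Y| - E|X-X'| - E|Y-Y'|: a distance between two empirical distributions
    xy = np.abs(x[:, None] - y[None, :]).mean()
    xx = np.abs(x[:, None] - x[None, :]).mean()
    yy = np.abs(y[:, None] - y[None, :]).mean()
    return 2 * xy - xx - yy

def objective(params):
    a, log_b = params
    theta = a + np.exp(log_b) * z    # one parameter draw per observation (individual parameterization)
    x = theta + eps                  # predictive sample induced by the parameter draws
    return energy_distance(x, y)

# Fixing the noise makes the objective deterministic, so a standard optimizer applies.
res = minimize(objective, x0=np.array([0.0, 0.0]), method="Nelder-Mead")
a_hat, b_hat = res.x[0], np.exp(res.x[1])
theta_samples = a_hat + b_hat * z    # approximate UQ distribution for the mean parameter
print(a_hat, b_hat, res.fun)
```

The optimized location `a_hat` lands near the sample mean, and `theta_samples` plays the role of the UQ distribution: its spread reflects how much parameter variation is compatible with the observed predictive fit.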
