β-Cores: Robust Large-Scale Bayesian Data Summarization in the Presence of Outliers

08/31/2020
by Dionysis Manousakas, et al.

Modern machine learning applications must address the intrinsic challenges of inference on massive real-world datasets, including scalability and robustness to outliers. Despite the many benefits of Bayesian methods (such as uncertainty-aware predictions, incorporation of expert knowledge, and hierarchical modeling), the quality of classic Bayesian inference depends critically on whether observations conform to the assumed data-generating model, which is impossible to guarantee in practice. In this work, we propose a variational inference method that, in a principled way, simultaneously scales to large datasets and robustifies the inferred posterior against outliers in the observed data. Reformulating Bayes' theorem via the β-divergence, we posit a robustified pseudo-Bayesian posterior as the target of inference. Moreover, building on recent formulations of Riemannian coresets for scalable Bayesian inference, we propose a sparse variational approximation of the robustified posterior and an efficient stochastic black-box algorithm to construct it. Overall, our method allows releasing cleansed data summaries that can be used broadly in scenarios including structured data corruption. We illustrate the applicability of our approach on diverse simulated and real datasets and across various statistical models, including Gaussian mean inference, logistic regression, and neural linear regression, demonstrating its superiority to existing Bayesian summarization methods in the presence of outliers.
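As a rough sketch of the construction described in the abstract (the notation below, with π_0 for the prior, f_n^β for the per-observation potential, and w for the coreset weights, is illustrative and may differ from the paper's exact formulation), the β-divergence reformulation replaces each log-likelihood term in Bayes' theorem with a density-power potential, and the coreset posits a sparse, weighted subset of those potentials:

\[
  \pi_{\beta}(\theta \mid x_{1:N}) \;\propto\; \pi_0(\theta)\,
  \exp\!\Big( \textstyle\sum_{n=1}^{N} f_n^{\beta}(\theta) \Big),
  \qquad
  f_n^{\beta}(\theta) \;=\; \tfrac{1}{\beta}\, p(x_n \mid \theta)^{\beta}
  \;-\; \tfrac{1}{\beta+1} \int p(y \mid \theta)^{\beta+1}\, \mathrm{d}y,
\]
\[
  \pi_{w}(\theta) \;\propto\; \pi_0(\theta)\,
  \exp\!\Big( \textstyle\sum_{n=1}^{N} w_n\, f_n^{\beta}(\theta) \Big),
  \qquad w \succeq 0,\quad \lVert w \rVert_0 \ll N.
\]

In the limit β → 0, the potential f_n^β recovers the ordinary log-likelihood up to an additive constant, so the construction interpolates between classic Bayes and a posterior that heavily down-weights observations improbable under the assumed model; the sparse weights w are then fit variationally with the stochastic black-box procedure mentioned above.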
