Federated Bayesian Computation via Piecewise Deterministic Markov Processes

10/25/2022
by Joris Bierkens et al.

When performing Bayesian computations in practice, one is often faced with the challenge that the constituent model components and/or the data are only available in a distributed fashion, e.g. due to privacy concerns or sheer volume. While various methods have been proposed for performing posterior inference in such federated settings, these either make very strong assumptions on the data and/or model or otherwise introduce significant bias when the local posteriors are combined to form an approximation of the target posterior. By leveraging recently developed methods for Markov Chain Monte Carlo (MCMC) based on Piecewise Deterministic Markov Processes (PDMPs), we develop a computation- and communication-efficient family of posterior inference algorithms (Fed-PDMC) which provides asymptotically exact approximations of the full posterior over a large class of Bayesian models, allowing heterogeneous model and data contributions from each client. We show that communication between clients and the server preserves the privacy of the individual data sources by establishing differential privacy guarantees. We quantify the performance of Fed-PDMC over a class of illustrative analytical case-studies and demonstrate its efficacy on a number of synthetic examples along with realistic Bayesian computation benchmarks.
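The Fed-PDMC algorithm itself is not specified in this abstract. As background on the PDMP-based MCMC samplers it builds on, the following is a minimal sketch of the one-dimensional Zig-Zag process targeting a standard normal posterior; all function names are illustrative, and the exact event-time inversion below is specific to the Gaussian potential U(x) = x²/2.

```python
import numpy as np

rng = np.random.default_rng(0)

def zigzag_std_normal(t_max=10_000.0):
    """One-dimensional Zig-Zag sampler for a standard normal target.

    The state (x, v) moves linearly, x(s) = x + v*s with v in {-1, +1},
    and the velocity flips at events of an inhomogeneous Poisson process
    with rate lambda(s) = max(0, v * U'(x + v*s)) = max(0, v*x + s).
    For this rate the event time can be sampled by exact inversion.
    """
    x, v, t = 0.0, 1.0, 0.0
    times, xs, vs = [0.0], [x], [v]
    while t < t_max:
        a = v * x
        e = rng.exponential()  # unit-rate exponential for time inversion
        if a >= 0.0:
            tau = np.sqrt(a * a + 2.0 * e) - a
        else:
            tau = -a + np.sqrt(2.0 * e)  # rate is zero until s = -a
        x += v * tau
        t += tau
        v = -v  # velocity flip at the event
        times.append(t); xs.append(x); vs.append(v)
    return np.array(times), np.array(xs), np.array(vs)

def discretise(times, xs, vs, dt=0.5):
    """Evaluate the piecewise-linear trajectory on a regular time grid."""
    grid = np.arange(0.0, times[-1], dt)
    idx = np.searchsorted(times, grid, side="right") - 1
    return xs[idx] + vs[idx] * (grid - times[idx])

times, xs, vs = zigzag_std_normal()
samples = discretise(times, xs, vs)
print(samples.mean(), samples.std())  # close to 0 and 1 for long runs
```

The continuous trajectory, not just the switch points, is the sample from the target; discretising it on a grid is one standard way to obtain ordinary Monte Carlo draws from a PDMP run.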


Related research

- 10/20/2019: Cores for Piecewise-Deterministic Markov Processes used in Markov Chain Monte Carlo
  This article provides a tool for the analysis of Piecewise-deterministic...
- 04/08/2020: Posterior computation with the Gibbs zig-zag sampler
  Markov chain Monte Carlo (MCMC) sampling algorithms have dominated the l...
- 07/02/2018: A Piecewise Deterministic Markov Process via (r,θ) swaps in hyperspherical coordinates
  Recently, a class of stochastic processes known as piecewise determinist...
- 02/17/2023: Piecewise Deterministic Markov Processes for Bayesian Neural Networks
  Inference on modern Bayesian Neural Networks (BNNs) often relies on a va...
- 08/17/2021: Wireless Federated Langevin Monte Carlo: Repurposing Channel Noise for Bayesian Sampling and Privacy
  Most works on federated learning (FL) focus on the most common frequenti...
- 03/16/2021: Gradient-Based Markov Chain Monte Carlo for Bayesian Inference With Non-Differentiable Priors
  The use of non-differentiable priors in Bayesian statistics has become i...
- 06/01/2021: QLSD: Quantised Langevin stochastic dynamics for Bayesian federated learning
  Federated learning aims at conducting inference when data are decentrali...
