Wireless Federated Langevin Monte Carlo: Repurposing Channel Noise for Bayesian Sampling and Privacy

08/17/2021
by Dongzhu Liu, et al.

Most works on federated learning (FL) adopt the standard frequentist formulation, in which the goal is to minimize the global empirical loss. Frequentist learning, however, is known to be problematic in the regime of limited data, as it fails to quantify epistemic uncertainty in prediction. Bayesian learning provides a principled solution to this problem by shifting the optimization domain to the space of distributions over the model parameters. This paper provides the first study of Bayesian FL in wireless systems by proposing and analyzing a gradient-based Markov Chain Monte Carlo (MCMC) method, Wireless Federated Langevin Monte Carlo (WFLMC). The key idea of this work is to repurpose channel noise to play the dual role of seed randomness for MCMC sampling and of a privacy-preserving mechanism. To this end, based on an analysis of the Wasserstein distance between the sample distribution and the global posterior distribution under privacy and power constraints, we introduce a power allocation strategy obtained as the solution of a convex program. The analysis identifies distinct operating regimes in which the system performance is power-limited, privacy-limited, or limited by the requirements of MCMC sampling. Both analytical and simulation results demonstrate that, under suitable conditions, channel noise, when properly accounted for, can be fully repurposed for both MCMC sampling and privacy preservation, achieving the same performance as in an ideal communication setting that is not subject to privacy constraints.
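The abstract does not spell out the update rule, but the general recipe of channel-driven Langevin sampling can be illustrated with a short sketch. The function name wireless_federated_langevin, its arguments, and the noise accounting below are illustrative assumptions rather than the paper's implementation: device gradients are superposed over the air, the additive Gaussian channel noise is counted toward the noise that Langevin dynamics requires, and the server injects extra noise only when the channel contribution falls short.

```python
import numpy as np

def wireless_federated_langevin(grad_fns, theta0, num_rounds=1000,
                                step_size=1e-3, channel_noise_std=0.1,
                                rng=None):
    """Illustrative sketch (not the paper's exact algorithm) of distributed
    Langevin sampling with over-the-air aggregation, where channel noise is
    reused as part of the Langevin perturbation.

    grad_fns : list of callables, one per device, each returning the gradient
               of that device's local negative log-posterior term at theta.
    theta0   : initial model parameter vector.
    """
    rng = np.random.default_rng() if rng is None else rng
    theta = np.asarray(theta0, dtype=float)
    samples = []
    for _ in range(num_rounds):
        # Over-the-air aggregation: the received signal is the superposition
        # of the devices' gradients plus additive Gaussian channel noise.
        aggregated = sum(g(theta) for g in grad_fns)
        channel_noise = rng.normal(0.0, channel_noise_std, size=theta.shape)
        received = aggregated + channel_noise

        # Langevin dynamics needs injected noise with variance 2 * step_size
        # per update. The channel noise, scaled by the step size in the
        # update, already contributes (step_size * channel_noise_std)^2,
        # so only the missing variance is added at the server.
        target_var = 2.0 * step_size
        channel_var = (step_size * channel_noise_std) ** 2
        extra_std = np.sqrt(max(target_var - channel_var, 0.0))
        extra_noise = rng.normal(0.0, extra_std, size=theta.shape)

        theta = theta - step_size * received + extra_noise
        samples.append(theta.copy())
    return np.asarray(samples)
```

The design point mirrored here is the noise accounting: when the scaled channel-noise variance already meets the Langevin target, no additional perturbation is injected, which is also why the same randomness can simultaneously act as a privacy-preserving mechanism.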


