Client Selection for Federated Bayesian Learning

12/11/2022
by Jiarong Yang et al.

Distributed Stein Variational Gradient Descent (DSVGD) is a non-parametric distributed learning framework for federated Bayesian learning, in which multiple clients jointly train a machine learning model by communicating a set of non-random, interacting particles with the server. Since communication resources are limited, selecting the clients with the most informative local learning updates can improve model convergence and communication efficiency. In this paper, we propose two client selection schemes for DSVGD, based on the Kernelized Stein Discrepancy (KSD) and the Hilbert Inner Product (HIP). For both schemes, we derive an upper bound on the decrease of the global free energy per iteration, which is then minimized to speed up model convergence. We evaluate and compare our schemes against conventional selection schemes in terms of model accuracy, convergence speed, and stability on various learning tasks and datasets.
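To make the KSD-based idea concrete, the sketch below estimates the squared Kernelized Stein Discrepancy of each client's particle set against the target score and picks the client with the largest discrepancy (the most room for improvement in that round). This is only an illustrative sketch, not the paper's exact scheme: the RBF kernel, the fixed bandwidth `h`, and the function names `ksd_rbf` / `select_client` are assumptions made for this example, and the paper's free-energy bound is not reproduced here.

```python
import numpy as np

def ksd_rbf(particles, score_fn, h=1.0):
    """V-statistic estimate of the squared Kernelized Stein Discrepancy
    between the empirical particle distribution and the target whose
    score (gradient of the log density) is score_fn, with an RBF kernel
    k(x, y) = exp(-||x - y||^2 / (2 h^2))."""
    X = np.asarray(particles, dtype=float)        # (n, d) particle matrix
    n, d = X.shape
    S = score_fn(X)                               # (n, d) target scores
    diff = X[:, None, :] - X[None, :, :]          # diff[i, j] = x_i - x_j
    sq = np.sum(diff ** 2, axis=-1)               # squared pairwise distances
    K = np.exp(-sq / (2.0 * h ** 2))              # RBF kernel matrix
    # Stein kernel u_p(x_i, x_j), assembled term by term:
    t1 = (S @ S.T) * K                                    # s(x)^T s(y) k(x, y)
    t2 = np.einsum('id,ijd->ij', S, diff) / h ** 2 * K    # s(x)^T grad_y k
    t3 = -np.einsum('jd,ijd->ij', S, diff) / h ** 2 * K   # s(y)^T grad_x k
    t4 = (d / h ** 2 - sq / h ** 4) * K                   # tr(grad_x grad_y k)
    return float(np.mean(t1 + t2 + t3 + t4))

def select_client(client_particles, score_fn, h=1.0):
    """Select the client whose local particles are farthest (in KSD) from
    the target, i.e. whose update is most informative this round."""
    ksds = [ksd_rbf(P, score_fn, h) for P in client_particles]
    return int(np.argmax(ksds))
```

For example, with a standard-normal target (`score_fn = lambda X: -X`), a client whose particles sit far from the posterior mode yields a larger KSD estimate than one whose particles already match it, and is therefore scheduled first.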
