Stochastic Coded Federated Learning with Convergence and Privacy Guarantees

01/25/2022
by Yuchang Sun, et al.

Federated learning (FL) has attracted much attention as a privacy-preserving distributed machine learning framework, in which many clients collaboratively train a machine learning model by exchanging model updates with a parameter server instead of sharing their raw data. Nevertheless, FL training suffers from slow convergence and unstable performance due to stragglers caused by the heterogeneous computational resources of clients and fluctuating communication rates. This paper proposes a coded FL framework, namely stochastic coded federated learning (SCFL), to mitigate the straggler issue. In the proposed framework, each client generates a privacy-preserving coded dataset by adding noise to a random linear combination of its local data. The server collects the coded datasets from all clients to construct a composite dataset, which compensates for the straggling effect. During training, both the server and the clients perform mini-batch stochastic gradient descent (SGD), and the server adds a make-up term in model aggregation to obtain unbiased gradient estimates. We characterize the privacy guarantee via mutual information differential privacy (MI-DP) and analyze the convergence performance of SCFL. Moreover, we demonstrate a privacy-performance tradeoff by analyzing the influence of the privacy constraint on the convergence rate. Finally, numerical experiments corroborate our analysis and show the benefits of SCFL in achieving fast convergence while preserving data privacy.
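The sketch below is a minimal, hedged illustration of the coded dataset construction summarized above, assuming a linear-regression-style setting where each client holds a feature matrix X and label vector y. The function name make_coded_dataset, the coding-matrix distribution, and the noise_std parameter are illustrative assumptions, not the paper's exact construction; in SCFL the noise level would be calibrated to the MI-DP constraint.

```python
import numpy as np

def make_coded_dataset(X, y, num_coded, noise_std, rng):
    """Hypothetical helper: build a client's coded share, where each coded
    sample is a random linear combination of the client's raw samples,
    perturbed by additive Gaussian noise before being sent to the server."""
    n = X.shape[0]
    # Random coding matrix: each row mixes the client's n local samples.
    G = rng.normal(0.0, 1.0 / np.sqrt(n), size=(num_coded, n))
    X_coded = G @ X + noise_std * rng.standard_normal((num_coded, X.shape[1]))
    y_coded = G @ y + noise_std * rng.standard_normal(num_coded)
    return X_coded, y_coded

rng = np.random.default_rng(0)
# Five toy clients, each with 100 local samples of dimension 10 (synthetic data).
clients = [(rng.standard_normal((100, 10)), rng.standard_normal(100))
           for _ in range(5)]
coded_shares = [make_coded_dataset(X, y, num_coded=20, noise_std=0.5, rng=rng)
                for X, y in clients]
# The server stacks the shares into a composite coded dataset and can run
# mini-batch SGD on it to compensate for straggling clients, while clients
# run mini-batch SGD on their raw local data.
X_comp = np.vstack([Xc for Xc, _ in coded_shares])
y_comp = np.concatenate([yc for _, yc in coded_shares])
```

Gradients computed by the server on this composite coded dataset stand in for the updates of straggling clients; the make-up term in model aggregation (not shown here) is what keeps the overall gradient estimate unbiased.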
