Gradient-Leakage Resilient Federated Learning

07/02/2021
by Wenqi Wei, et al.

Federated learning (FL) is an emerging distributed learning paradigm that offers default client privacy: clients keep sensitive data on their devices and share only local training parameter updates with the federated server. However, recent studies reveal that gradient leakage in FL may compromise the privacy of client training data. This paper presents a gradient-leakage-resilient approach to privacy-preserving federated learning with per-training-example client differential privacy, coined Fed-CDP. It makes three original contributions. First, we identify three types of client gradient leakage threats in federated learning, even with encrypted client-server communications, and articulate when and why the conventional server-coordinated differential privacy approach, coined Fed-SDP, is insufficient to protect the privacy of the training data. Second, we introduce Fed-CDP, the per-example client differential privacy algorithm, provide a formal analysis of Fed-CDP with the (ϵ, δ) differential privacy guarantee, and formally compare Fed-CDP and Fed-SDP in terms of privacy accounting. Third, we formally analyze the privacy-utility trade-off of the differential privacy guarantee provided by Fed-CDP and present a dynamic decay noise-injection policy that further improves the accuracy and resilience of Fed-CDP. We evaluate Fed-CDP and Fed-CDP(decay) against Fed-SDP in terms of differential privacy guarantee and gradient leakage resilience over five benchmark datasets. The results show that Fed-CDP outperforms the conventional Fed-SDP in resilience to client gradient leakage while offering competitive accuracy in federated learning.
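To make the per-example idea concrete, here is a minimal sketch of what a per-training-example client DP step could look like: each example's gradient is clipped to a fixed norm bound before Gaussian noise is added, in contrast to a server-side scheme that perturbs only the aggregated client update. The function names, the geometric decay schedule, and all parameter values are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def per_example_dp_update(per_example_grads, clip_norm=1.0, sigma=1.0,
                          rng=np.random.default_rng()):
    """Sketch of a per-example client DP step (Fed-CDP-style, hypothetical):
    clip each training example's gradient to clip_norm, sum the clipped
    gradients, add Gaussian noise scaled by sigma * clip_norm, then average."""
    clipped = []
    for g in per_example_grads:
        norm = np.linalg.norm(g)
        # Scale down any gradient whose L2 norm exceeds the clipping bound.
        clipped.append(g * min(1.0, clip_norm / max(norm, 1e-12)))
    total = np.sum(clipped, axis=0)
    noise = rng.normal(0.0, sigma * clip_norm, size=total.shape)
    return (total + noise) / len(per_example_grads)

def decayed_sigma(sigma0, round_t, decay_rate=0.95):
    """Hypothetical dynamic decay noise-injection policy: shrink the noise
    scale geometrically as federated rounds progress, trading a little
    late-stage noise for accuracy."""
    return sigma0 * (decay_rate ** round_t)
```

With `sigma=0.0` the update reduces to the plain average of clipped per-example gradients, which makes the clipping step easy to verify in isolation; the decay schedule would then supply `sigma` for each round.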


Related research

- Securing Distributed SGD against Gradient Leakage Threats (05/10/2023)
- A Framework for Evaluating Gradient Leakage Attacks in Federated Learning (04/22/2020)
- LDP-Fed: Federated Learning with Local Differential Privacy (06/05/2020)
- FED-χ^2: Privacy Preserving Federated Correlation Test (05/30/2021)
- Efficient Federated Learning with Enhanced Privacy via Lottery Ticket Pruning in Edge Computing (05/02/2023)
- Fed-GLOSS-DP: Federated, Global Learning using Synthetic Sets with Record Level Differential Privacy (02/02/2023)
- Responsive Web User Interface to Recover Training Data from User Gradients in Federated Learning (06/08/2020)
