FedSel: Federated SGD under Local Differential Privacy with Top-k Dimension Selection

03/24/2020
by Ruixuan Liu, et al.

As massive amounts of data are generated on small devices, federated learning on mobile devices has become an emerging trend. In the federated setting, Stochastic Gradient Descent (SGD) is widely used to train various machine learning models. To prevent privacy leakage from gradients computed on users' sensitive data, local differential privacy (LDP) has recently been adopted as a privacy guarantee for federated SGD. However, existing solutions suffer from a dimension dependency problem: the injected noise grows substantially with the dimension d. In this work, we propose FedSel, a two-stage framework for federated SGD under LDP that mitigates this problem. Our key idea is that not all dimensions are equally important, so we privately select the Top-k dimensions according to their contributions in each iteration of federated SGD. Specifically, we propose three private dimension-selection mechanisms and adapt the gradient accumulation technique to stabilize the learning process with noisy updates. We also theoretically analyze the privacy, accuracy, and time complexity of FedSel, which outperforms state-of-the-art solutions. Experiments on real-world and synthetic datasets verify the effectiveness and efficiency of our framework.
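The abstract outlines the protocol at a high level; for concreteness, below is a minimal sketch in Python of what one client's per-round report could look like when a single dimension is chosen with a bias toward the top-k gradient coordinates and its value is released under LDP. The function names, the exponential-mechanism-style selection, and the one-bit (Duchi-style) value perturbation are illustrative assumptions, not the three selection mechanisms or the accumulation scheme proposed in the paper.

import numpy as np

def private_topk_select(grad, k, eps_select, rng):
    # Hypothetical sketch: pick ONE dimension to report, biased toward the k
    # largest-magnitude coordinates via an exponential-mechanism-style draw.
    d = grad.size
    topk = set(np.argsort(np.abs(grad))[-k:].tolist())
    scores = np.array([1.0 if j in topk else 0.0 for j in range(d)])  # score sensitivity is 1
    probs = np.exp(eps_select * scores / 2.0)
    probs /= probs.sum()
    return int(rng.choice(d, p=probs))

def perturb_value(value, clip, eps_value, rng):
    # One-bit randomized perturbation of the clipped value (Duchi-style);
    # the scaled output is an unbiased estimate of the clipped coordinate.
    v = np.clip(value, -clip, clip) / clip                 # map into [-1, 1]
    p = 0.5 + (np.exp(eps_value) - 1.0) / (np.exp(eps_value) + 1.0) * v / 2.0
    sign = 1.0 if rng.random() < p else -1.0
    scale = clip * (np.exp(eps_value) + 1.0) / (np.exp(eps_value) - 1.0)
    return sign * scale

# One client's contribution in a single round (toy data, d = 100, k = 10).
rng = np.random.default_rng(0)
local_grad = rng.normal(size=100)                          # stand-in for a real local gradient
dim = private_topk_select(local_grad, k=10, eps_select=1.0, rng=rng)
noisy_val = perturb_value(local_grad[dim], clip=1.0, eps_value=1.0, rng=rng)
# The client uploads only (dim, noisy_val); the server averages such sparse,
# noisy reports across clients to form the global update for the round.

In this toy setting, sequential composition means a round costs the client eps_select + eps_value of local privacy budget; the selection bias is what keeps the report informative even though only one noisy coordinate per client leaves the device.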


Related research

06/08/2020 · Attacks to Federated Learning: Responsive Web User Interface to Recover Training Data from User Gradients
Local differential privacy (LDP) is an emerging privacy standard to prot...

05/10/2023 · Securing Distributed SGD against Gradient Leakage Threats
This paper presents a holistic approach to gradient leakage resilient di...

12/10/2021 · Federated Two-stage Learning with Sign-based Voting
Federated learning is a distributed machine learning mechanism where loc...

04/23/2019 · Semi-Cyclic Stochastic Gradient Descent
We consider convex SGD updates with a block-cyclic structure, i.e. where...

02/08/2023 · Exploratory Analysis of Federated Learning Methods with Differential Privacy on MIMIC-III
Background: Federated learning methods offer the possibility of training...

03/28/2020 · Differentially Private Federated Learning for Resource-Constrained Internet of Things
With the proliferation of smart devices having built-in sensors, Interne...
