Differential Private Hogwild! over Distributed Local Data Sets

02/17/2021
by Marten van Dijk, et al.

We consider the Hogwild! setting where clients run local SGD iterations with Gaussian-based Differential Privacy (DP) on their own local data sets, with the aim of (1) jointly converging to a global model (by interacting on a round-by-round basis with a centralized server that aggregates local SGD updates into a global model) while (2) keeping each local data set differentially private with respect to the outside world (including all other clients, who can monitor client-server interactions). We show, for a broad class of sample size sequences (these define the number of local SGD iterations per round), that a local data set is (ϵ,δ)-DP if the standard deviation σ of the Gaussian noise added per round of interaction with the centralized server is at least √(2(ϵ + ln(1/δ))/ϵ).
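To make the abstract's noise bound and per-round protocol concrete, the sketch below (a minimal illustration, not the paper's pseudocode) computes the σ ≥ √(2(ϵ + ln(1/δ))/ϵ) threshold stated above and shows one hypothetical client round: a number of local SGD iterations starting from the current global model, with per-sample gradient clipping and Gaussian noise added to the update before it is sent to the server. The names grad_fn, clip, s_i, and the with-replacement sampling are assumptions made for illustration only.

```python
import math
import numpy as np

def required_sigma(eps: float, delta: float) -> float:
    """Per-round Gaussian noise level sigma >= sqrt(2*(eps + ln(1/delta)) / eps)
    quoted in the abstract for keeping a local data set (eps, delta)-DP."""
    return math.sqrt(2.0 * (eps + math.log(1.0 / delta)) / eps)

def noisy_local_round(w_global, samples, grad_fn, s_i, lr, clip, sigma, rng):
    """One illustrative client round: s_i local SGD iterations starting from the
    current global model, with per-sample gradient clipping, followed by Gaussian
    noise added to the resulting update before it is sent to the server.
    (Hypothetical sketch, not the paper's algorithm.)"""
    w = w_global.copy()
    for _ in range(s_i):
        x, y = samples[rng.integers(len(samples))]            # sample a data point
        g = grad_fn(w, x, y)
        g = g * min(1.0, clip / (np.linalg.norm(g) + 1e-12))  # clip per-sample gradient
        w = w - lr * g
    update = w - w_global
    noise = rng.normal(0.0, sigma * clip, size=update.shape)  # Gaussian DP noise
    return update + noise

# Example: required per-round sigma for (eps, delta) = (1.0, 1e-5)
print(round(required_sigma(1.0, 1e-5), 3))   # ≈ 5.003
```

For example, with (ϵ, δ) = (1, 10⁻⁵) the bound evaluates to σ ≈ 5.0 per round of client-server interaction.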


