Taming Client Dropout for Distributed Differential Privacy in Federated Learning

09/26/2022
by   Zhifeng Jiang, et al.

Federated learning (FL) is increasingly deployed among multiple clients (e.g., mobile devices) to train a shared model over decentralized data. To address privacy concerns, FL systems must protect the clients' data from being revealed during training and also control the data leakage through trained models when they are exposed to untrusted domains. Distributed differential privacy (DP) offers an appealing solution in this regard, as it achieves an informed tradeoff between privacy and utility without requiring a trusted server. However, existing distributed DP mechanisms are impractical in real-world deployments: when faced with realistic client dropout, they often make strong assumptions about client participation yet still suffer either weak privacy guarantees or poor training accuracy. We present Hyades, a distributed differentially private FL framework that is highly efficient and resilient to client dropout. First, we develop a new privacy accounting technique under the notion of Rényi DP that tightly bounds the privacy loss in the presence of dropout before client sampling in FL. This enables Hyades to set a minimum target noise level for each training round. Second, we propose a novel 'add-then-remove' masking scheme that enforces this target noise level even when some sampled clients ultimately drop out. Third, we design an efficient secure aggregation mechanism that pipelines communication and computation for faster execution. Evaluation in a large-scale cloud deployment shows that Hyades efficiently handles client dropout in various realistic scenarios, attaining the optimal privacy-utility tradeoff and accelerating training by up to 2.1× compared to existing solutions.
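The abstract does not spell out the masking protocol, but the intuition behind 'add-then-remove' noise enforcement can be sketched as follows. In this minimal Python simulation (not the Hyades protocol itself), each sampled client adds Gaussian noise calibrated for the worst case in which only `n_min` clients survive; once the actual survivor set is known, the excess noise contributed by the extra survivors is removed so the aggregate noise matches the target level exactly. The function name `add_then_remove_demo` and the parameters `sigma_target` and `n_min` are illustrative assumptions, and the direct subtraction stands in for what a real protocol would reconstruct from shared randomness.

```python
import numpy as np

rng = np.random.default_rng(0)

def add_then_remove_demo(updates, sigma_target, n_min):
    """Illustrative sketch (assumed names, not the Hyades protocol):
    enforce a target aggregate noise std despite client dropout."""
    n_live = len(updates)  # clients that actually reported back
    assert n_live >= n_min, "more clients dropped out than the worst case allows"

    # "Add" step: each client adds noise sized for the worst case,
    # i.e., as if only n_min clients will survive the round.
    per_client_std = sigma_target / np.sqrt(n_min)
    noises = [rng.normal(0.0, per_client_std, size=u.shape) for u in updates]
    noisy_sum = sum(u + z for u, z in zip(updates, noises))

    # "Remove" step: since n_live >= n_min clients survived, the sum
    # carries more noise than needed. In a real deployment the excess
    # would be reconstructed from shared seeds; here we subtract it
    # directly, leaving exactly n_min noise shares in the aggregate.
    excess = n_live - n_min
    if excess > 0:
        noisy_sum -= sum(noises[:excess])
    return noisy_sum

# Usage: 8 of 10 sampled clients survive; the aggregate noise std is
# sigma_target regardless of how many of the 10 actually dropped out.
updates = [rng.normal(size=4) for _ in range(8)]
agg = add_then_remove_demo(updates, sigma_target=1.0, n_min=5)
```

The point of the sketch is that the aggregate noise variance, n_min × (sigma_target² / n_min) = sigma_target², is pinned to the target level whether dropout is mild or as severe as the worst case planned for.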
