Gradients Look Alike: Sensitivity is Often Overestimated in DP-SGD

07/01/2023
by Anvith Thudi, et al.

Differentially private stochastic gradient descent (DP-SGD) is the canonical algorithm for private deep learning. While its privacy analysis is known to be tight in the worst case, several empirical results suggest that, when training on common benchmark datasets, the resulting models leak significantly less privacy for many datapoints. In this paper, we develop a new analysis for DP-SGD that captures the intuition that points with similar neighbors in the dataset enjoy better privacy than outliers. Formally, this is done by modifying the per-step privacy analysis of DP-SGD to introduce a dependence on the distribution of model updates computed from the training dataset. We further develop a new composition theorem to effectively use this per-step analysis to reason about an entire training run. Put together, our evaluation shows that this novel DP-SGD analysis lets us formally show that DP-SGD leaks significantly less privacy for many datapoints. In particular, we observe that correctly classified points obtain better privacy guarantees than misclassified points.
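For context, the analysis above concerns the standard DP-SGD update (Abadi et al., 2016): clip each per-example gradient to a fixed L2 norm, sum, add Gaussian noise calibrated to that norm, and average. Below is a minimal sketch of one such step; the toy linear-regression loss and the parameter names (clip_norm, noise_multiplier, lr) are illustrative assumptions and not taken from the paper, which refines the per-step privacy analysis rather than this mechanism itself.

```python
# Minimal sketch of one DP-SGD step on a toy linear-regression objective.
# All names and hyperparameters here are assumptions for illustration only.
import numpy as np

def dp_sgd_step(params, X_batch, y_batch,
                clip_norm=1.0, noise_multiplier=1.0, lr=0.1, rng=None):
    """One DP-SGD update: per-example gradients are clipped to clip_norm,
    summed, perturbed with Gaussian noise of scale noise_multiplier * clip_norm,
    and averaged before the gradient step."""
    if rng is None:
        rng = np.random.default_rng(0)
    clipped = []
    for x, y in zip(X_batch, y_batch):
        # Per-example gradient of a squared-error loss for a linear model.
        residual = x @ params - y
        g = residual * x
        # Clip the per-example gradient so its L2 norm is at most clip_norm.
        g = g / max(1.0, np.linalg.norm(g) / clip_norm)
        clipped.append(g)
    # Sum the clipped gradients and add noise calibrated to the clipping norm.
    noisy_sum = np.sum(clipped, axis=0) + rng.normal(
        scale=noise_multiplier * clip_norm, size=params.shape)
    return params - lr * noisy_sum / len(X_batch)

# Usage: a few steps on synthetic data.
rng = np.random.default_rng(0)
X, y = rng.normal(size=(32, 5)), rng.normal(size=32)
theta = np.zeros(5)
for _ in range(10):
    theta = dp_sgd_step(theta, X, y, rng=rng)
```

The standard worst-case accounting treats every datapoint as if its clipped gradient could differ maximally from its neighbors'; the paper's per-step analysis instead conditions on the distribution of updates actually induced by the training dataset.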


Related research:

06/06/2022 · Per-Instance Privacy Accounting for Differentially Private Stochastic Gradient Descent
  Differentially private stochastic gradient descent (DP-SGD) is the workh...

02/26/2020 · On the Effectiveness of Mitigating Data Poisoning Attacks with Gradient Shaping
  Machine learning algorithms are vulnerable to data poisoning attacks. Pr...

12/12/2022 · Generalizing DP-SGD with Shuffling and Batching Clipping
  Classical differential private DP-SGD implements individual clipping wit...

05/25/2023 · DP-SGD Without Clipping: The Lipschitz Neural Network Way
  State-of-the-art approaches for training Differentially Private (DP) Dee...

02/10/2022 · Backpropagation Clipping for Deep Learning with Differential Privacy
  We present backpropagation clipping, a novel variant of differentially p...

06/14/2020 · Differentially Private Decentralized Learning
  Decentralized learning has received great attention for its high efficie...

12/05/2019 · On the Intrinsic Privacy of Stochastic Gradient Descent
  Private learning algorithms have been proposed that ensure strong differ...
