DP-Adam: Correcting DP Bias in Adam's Second Moment Estimation

04/21/2023
by Qiaoyue Tang, et al.

We observe that the conventional use of DP with the Adam optimizer introduces a bias in the second moment estimate, caused by the independent noise added during gradient computation. This bias changes the scaling of low-variance parameter updates in a way that is inconsistent both with the behavior of non-private Adam and with Adam's sign-descent interpretation. Empirically, correcting the bias introduced by DP noise significantly improves the optimization performance of DP-Adam.
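To make the mechanism concrete, below is a minimal NumPy sketch of the idea described in the abstract: in DP-SGD-style training, each per-example gradient is clipped and Gaussian noise with a known standard deviation is added, so the naive Adam second moment estimate is inflated by exactly that noise variance. Subtracting the known noise variance (and flooring at a small constant) before forming the update denominator restores scaling closer to non-private Adam. The function names, hyperparameters, and the floor constant `gamma` are illustrative assumptions, not the authors' reference implementation.

```python
import numpy as np

def clip_and_noise(per_example_grads, clip_norm, noise_mult, rng):
    """Standard DP-SGD gradient release: clip each example's gradient
    to `clip_norm`, average over the batch, and add Gaussian noise
    calibrated to the clipping norm. Returns the noisy gradient and
    the (known) std of the noise on the averaged gradient."""
    B = per_example_grads.shape[0]
    norms = np.linalg.norm(per_example_grads.reshape(B, -1), axis=1)
    scale = np.minimum(1.0, clip_norm / (norms + 1e-12))
    scale = scale.reshape(B, *([1] * (per_example_grads.ndim - 1)))
    mean_grad = (per_example_grads * scale).sum(axis=0) / B
    noise_std = noise_mult * clip_norm / B  # noise is added to the sum, then divided by B
    return mean_grad + rng.normal(0.0, noise_std, size=mean_grad.shape), noise_std

def dp_adam_corrected_step(theta, m, v, t, noisy_grad, noise_std,
                           lr=1e-3, beta1=0.9, beta2=0.999,
                           eps=1e-8, gamma=1e-8):
    """One Adam step with the second-moment correction: since the added
    noise is independent, E[g_noisy^2] = E[g_private^2] + noise_std^2,
    so the noise variance is subtracted from the bias-corrected second
    moment and the result floored at a small constant `gamma`."""
    m = beta1 * m + (1 - beta1) * noisy_grad
    v = beta2 * v + (1 - beta2) * noisy_grad**2
    m_hat = m / (1 - beta1**t)
    v_hat = v / (1 - beta2**t)
    v_corr = np.maximum(v_hat - noise_std**2, gamma)  # remove DP noise variance
    theta = theta - lr * m_hat / (np.sqrt(v_corr) + eps)
    return theta, m, v

# Toy usage with synthetic per-example gradients:
rng = np.random.default_rng(0)
theta = np.zeros(10)
m, v = np.zeros(10), np.zeros(10)
for t in range(1, 101):
    grads = rng.normal(0.1, 0.05, size=(32, 10))  # fake batch of gradients
    g, std = clip_and_noise(grads, clip_norm=1.0, noise_mult=1.0, rng=rng)
    theta, m, v = dp_adam_corrected_step(theta, m, v, t, g, std)
```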
