Differentially Private Learning with Per-Sample Adaptive Clipping

12/01/2022
by Tianyu Xia, et al.

Privacy in AI has drawn growing attention from researchers and the general public in recent years. As one way to implement privacy-preserving AI, differentially private learning is a framework that enables AI models to be trained with differential privacy (DP). To achieve DP in the learning process, existing algorithms typically limit the magnitude of per-sample gradients with a constant clipping threshold, which must be carefully tuned because of its significant impact on model performance. To address this issue, recent works, NSGD and Auto-S, propose to replace clipping with normalization in order to avoid hyperparameter tuning. However, normalization-based approaches such as NSGD and Auto-S rely on a monotonic weight function, which assigns excessive weight to samples with small gradients and introduces extra deviation into the update. In this paper, we propose a Differentially Private Per-Sample Adaptive Clipping (DP-PSAC) algorithm based on a non-monotonic adaptive weight function. It guarantees privacy without the hyperparameter tuning typically required by constant clipping, while significantly reducing the deviation between the update and the true batch-averaged gradient. We provide a rigorous theoretical convergence analysis and show that, at the same order of convergence rate, the proposed algorithm achieves a lower non-vanishing bound than NSGD/Auto-S, and this bound is maintained over training iterations. In addition, through extensive experimental evaluation, we show that DP-PSAC outperforms or matches state-of-the-art methods on multiple mainstream vision and language tasks.
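To make the distinction between these weighting schemes concrete, below is a minimal NumPy sketch of a single DP-SGD-style update with per-sample re-weighting. The function names, the clipping bound C, the stability constant r, and the particular non-monotonic form in adaptive_weight are illustrative assumptions rather than the paper's exact formulation; the sketch only shows how constant clipping, NSGD/Auto-S-style normalization, and a non-monotonic adaptive weight differ in how they scale each per-sample gradient before noise is added.

    # Minimal sketch (illustrative assumptions, not the paper's exact implementation)
    # contrasting three per-sample re-weighting schemes in DP-SGD-style training.
    import numpy as np

    def clip_weight(grad_norm, C=1.0):
        # Constant clipping (standard DP-SGD): rescale only gradients whose norm exceeds C.
        return min(1.0, C / max(grad_norm, 1e-12))

    def normalize_weight(grad_norm, C=1.0, r=0.01):
        # Normalization in the style of NSGD/Auto-S: monotonically decreasing in the
        # gradient norm, so samples with tiny gradients receive the largest relative weight.
        return C / (grad_norm + r)

    def adaptive_weight(grad_norm, C=1.0, r=0.01):
        # Hypothetical non-monotonic per-sample adaptive weight: close to normalization
        # for large gradients, but the weight shrinks again as the norm approaches zero,
        # limiting the influence of very small gradients. This particular form is an
        # illustrative assumption, not necessarily the exact DP-PSAC weight function.
        return C / (grad_norm + r / (grad_norm + r))

    def private_step(per_sample_grads, weight_fn, C=1.0, sigma=1.0, rng=None):
        # One private update: re-weight each per-sample gradient (each weighted gradient
        # has norm at most C), sum, add Gaussian noise calibrated to C, and average.
        rng = np.random.default_rng(0) if rng is None else rng
        weighted = [weight_fn(np.linalg.norm(g), C) * g for g in per_sample_grads]
        noise = sigma * C * rng.standard_normal(per_sample_grads[0].shape)
        return (np.sum(weighted, axis=0) + noise) / len(per_sample_grads)

    if __name__ == "__main__":
        grads = [np.random.default_rng(i).standard_normal(4) for i in range(8)]
        for fn in (clip_weight, normalize_weight, adaptive_weight):
            print(fn.__name__, private_step(grads, fn))

Note how the normalization weight grows toward C/r as the gradient norm shrinks to zero, whereas the non-monotonic weight stays bounded there; this is the kind of excess weight on small-gradient samples, and the resulting deviation from the batch-averaged gradient, that the abstract attributes to monotonic weight functions.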

