Differential Privacy Meets Neural Network Pruning

03/08/2023
by Kamil Adamczewski, et al.

A major challenge in applying differential privacy to the training of deep neural network models is scalability. The widely used training algorithm, differentially private stochastic gradient descent (DP-SGD), struggles to train moderately sized neural network models at values of epsilon corresponding to a high level of privacy protection. In this paper, we explore the idea of dimensionality reduction, inspired by neural network pruning, to improve the scalability of DP-SGD. We study the interplay between neural network pruning and differential privacy through two modes of parameter updates. In the first mode, parameter freezing, we pre-prune the network and update only the remaining parameters using DP-SGD. In the second mode, parameter selection, we select which parameters to update at each training step and update only those selected using DP-SGD. In both modes, we use public data to freeze or select parameters, so that these steps incur no additional privacy loss. Naturally, the closeness between the private and public data plays an important role in the success of this paradigm. Our experimental results demonstrate how decreasing the parameter space improves differentially private training. Moreover, by studying two popular forms of pruning which do not rely on gradients and therefore incur no additional privacy loss, we show that random selection performs on par with magnitude-based selection for DP-SGD training.
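To make the two update modes concrete, the following is a minimal, illustrative sketch (not the authors' implementation) of a single DP-SGD step restricted to a pruned parameter subset. It assumes a model flattened into a NumPy parameter vector, a list of per-example gradients, and a binary mask derived from public data; all function and variable names here are hypothetical. Under parameter freezing the mask is fixed once before training, whereas under parameter selection a new mask is drawn at each step.

```python
# Illustrative sketch of DP-SGD on a pruned parameter subset (assumptions:
# flat NumPy parameter vector, precomputed per-example gradients, binary mask).
import numpy as np

def dp_sgd_step(params, per_example_grads, mask, lr=0.1,
                clip_norm=1.0, noise_multiplier=1.0,
                rng=np.random.default_rng()):
    """One DP-SGD update applied only to coordinates where mask == 1."""
    clipped = []
    for g in per_example_grads:
        g = g * mask                                  # zero out pruned/frozen coordinates
        norm = np.linalg.norm(g)
        g = g * min(1.0, clip_norm / (norm + 1e-12))  # per-example L2 clipping
        clipped.append(g)
    batch_size = len(clipped)

    # Sum clipped gradients, add Gaussian noise calibrated to the clipping
    # norm, and restrict the noise to the selected coordinates as well.
    noise = rng.normal(0.0, noise_multiplier * clip_norm, size=params.shape) * mask
    noisy_grad = (np.sum(clipped, axis=0) + noise) / batch_size

    return params - lr * noisy_grad

# Mode 1, parameter freezing: build `mask` once (e.g. from a magnitude- or
# randomly-pruned model trained on public data) and reuse it every step.
# Mode 2, parameter selection: draw a fresh mask each step, e.g.
#   mask = (rng.random(params.shape) < keep_fraction).astype(float)
```

Because the mask zeroes both the gradients and the noise on pruned coordinates, reducing the number of updated parameters also reduces the total amount of noise injected per step, which is the intuition behind the scalability gains discussed in the abstract.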

