Differentially Private Model Publishing for Deep Learning

04/03/2019
by Lei Yu, et al.

Deep learning techniques based on neural networks have shown significant success in a wide range of AI tasks. Large-scale training datasets are one of the critical factors for their success. However, when training datasets are crowdsourced from individuals and contain sensitive information, the model parameters may encode private information and carry the risk of privacy leakage. The growing trend of sharing and publishing pre-trained models further aggravates such privacy risks. To tackle this problem, we propose a differentially private approach for training neural networks. Our approach includes several new techniques for optimizing both privacy loss and model accuracy. We employ a generalization of differential privacy called concentrated differential privacy (CDP), with a formal and refined privacy loss analysis for two different data batching methods. We implement a dynamic privacy budget allocator over the course of training to improve model accuracy. Extensive experiments demonstrate that our approach effectively improves privacy loss accounting, training efficiency, and model quality under a given privacy budget.
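To make the abstract's recipe concrete, the sketch below shows a standard differentially private SGD loop (per-example gradient clipping plus calibrated Gaussian noise) combined with a simple decaying noise multiplier as a stand-in for a dynamic privacy budget allocated over training. This is a minimal illustrative sketch on a toy logistic-regression task, not the paper's implementation; the clipping norm, noise schedule, and other hyperparameters are assumptions chosen only for illustration.

```python
# Minimal sketch of DP-SGD with a decaying noise multiplier.
# Hyperparameters (clip_norm, sigma schedule, lr, ...) are illustrative
# placeholders, not values from the paper.
import numpy as np

rng = np.random.default_rng(0)

# Toy data: binary logistic regression on synthetic features.
n, d = 1000, 20
X = rng.normal(size=(n, d))
true_w = rng.normal(size=d)
y = (X @ true_w + 0.1 * rng.normal(size=n) > 0).astype(float)

def per_example_grads(w, Xb, yb):
    """Per-example cross-entropy gradients, shape (batch, d)."""
    p = 1.0 / (1.0 + np.exp(-(Xb @ w)))   # predicted probabilities
    return (p - yb)[:, None] * Xb          # d/dw of logistic loss

def dp_sgd(epochs=10, batch_size=100, lr=0.1, clip_norm=1.0,
           sigma_start=2.0, sigma_end=1.0):
    """DP-SGD with a linearly decaying noise multiplier, a simple
    stand-in for a dynamic privacy budget over the course of training."""
    w = np.zeros(d)
    for epoch in range(epochs):
        # Noise decays over training: less noise (more budget) in later
        # epochs, when gradients are smaller and noise hurts more.
        sigma = sigma_start + (sigma_end - sigma_start) * epoch / max(epochs - 1, 1)
        idx = rng.permutation(n)
        for start in range(0, n, batch_size):
            b = idx[start:start + batch_size]
            g = per_example_grads(w, X[b], y[b])
            # Clip each example's gradient to L2 norm <= clip_norm.
            norms = np.linalg.norm(g, axis=1, keepdims=True)
            g = g / np.maximum(1.0, norms / clip_norm)
            # Sum, add Gaussian noise scaled to the clipping norm, average.
            noisy = g.sum(axis=0) + rng.normal(scale=sigma * clip_norm, size=d)
            w -= lr * noisy / len(b)
    return w

w_priv = dp_sgd()
acc = ((X @ w_priv > 0).astype(float) == y).mean()
print(f"training accuracy under DP-SGD sketch: {acc:.3f}")
```

In this sketch the overall privacy loss would still have to be tracked by an accountant (for example under CDP, as the paper proposes) that composes the per-step Gaussian-mechanism guarantees across the varying noise levels.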

Related research

07/01/2016  Deep Learning with Differential Privacy
03/02/2021  DPlis: Boosting Utility of Differentially Private Deep Learning via Randomized Smoothing
03/01/2021  Wide Network Learning with Differential Privacy
03/18/2021  Super-convergence and Differential Privacy: Training faster with better privacy guarantees
02/26/2018  Learning Anonymized Representations with Adversarial Neural Networks
04/05/2023  PrivGraph: Differentially Private Graph Data Publication by Exploiting Community Information
05/20/2020  InfoScrub: Towards Attribute Privacy by Targeted Obfuscation
