Differentially Private Coordinate Descent for Composite Empirical Risk Minimization

10/22/2021
by Paul Mangold, et al.

Machine learning models can leak information about the data used to train them. Differentially Private (DP) variants of optimization algorithms like Stochastic Gradient Descent (DP-SGD) have been designed to mitigate this, inducing a trade-off between privacy and utility. In this paper, we propose a new method for composite Differentially Private Empirical Risk Minimization (DP-ERM): Differentially Private proximal Coordinate Descent (DP-CD). We analyze its utility through a novel theoretical analysis of inexact coordinate descent, and highlight some regimes where DP-CD outperforms DP-SGD, thanks to the possibility of using larger step sizes. We also prove new lower bounds for composite DP-ERM under coordinate-wise regularity assumptions, that are, in some settings, nearly matched by our algorithm. In practical implementations, the coordinate-wise nature of DP-CD updates demands special care in choosing the clipping thresholds used to bound individual contributions to the gradients. A natural parameterization of these thresholds emerges from our theory, limiting the addition of unnecessarily large noise without requiring coordinate-wise hyperparameter tuning or extra computational cost.
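To make the idea concrete, here is a minimal sketch of one DP proximal coordinate descent step on a lasso objective (least squares plus an L1 penalty). The function name `dp_cd_step`, the fixed coordinate schedule, and the exact noise calibration are illustrative assumptions, not the paper's implementation: per-example gradients along the chosen coordinate are clipped to a coordinate-wise threshold, averaged, perturbed with Gaussian noise, and followed by a proximal (soft-thresholding) step.

```python
import numpy as np

def dp_cd_step(w, X, y, j, step, clip, sigma, lam, rng):
    """One DP proximal coordinate descent step (illustrative sketch).

    Objective: 0.5 * mean((X @ w - y)**2) + lam * ||w||_1.
    Per-example gradients along coordinate j are clipped to `clip`,
    averaged, and perturbed with Gaussian noise whose scale grows with
    the clipping threshold (a simplified calibration for illustration).
    """
    n = X.shape[0]
    # Per-example gradient of the smooth part along coordinate j.
    residuals = X @ w - y
    grads = residuals * X[:, j]            # shape (n,)
    # Clip individual contributions to bound each example's influence.
    clipped = np.clip(grads, -clip, clip)
    # Average and add Gaussian noise; sensitivity of the mean is O(clip / n).
    noisy_grad = clipped.mean() + sigma * clip * rng.normal() / n
    # Gradient step with a coordinate-wise step size.
    v = w[j] - step * noisy_grad
    # Proximal step for the L1 penalty: soft-thresholding.
    w[j] = np.sign(v) * max(abs(v) - step * lam, 0.0)
    return w
```

Because each update touches a single coordinate, the step size and the clipping threshold can both be chosen per coordinate, which is exactly the regime the paper's analysis exploits to allow larger steps than DP-SGD.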


Related research

- High-Dimensional Private Empirical Risk Minimization by Greedy Coordinate Descent (07/04/2022)
- Differentially Private Stochastic Coordinate Descent (06/12/2020)
- Differentially private training of neural networks with Langevin dynamics for calibrated predictive uncertainty (07/09/2021)
- Recycling Scraps: Improving Private Learning by Leveraging Intermediate Checkpoints (10/04/2022)
- Efficient Hyperparameter Optimization for Differentially Private Deep Learning (08/09/2021)
- Differentially Private Variational Autoencoders with Term-wise Gradient Aggregation (06/19/2020)
- Efficient Private ERM for Smooth Objectives (03/29/2017)
