Private Non-smooth Empirical Risk Minimization and Stochastic Convex Optimization in Subquadratic Steps

03/29/2021
by Janardhan Kulkarni, et al.

We study the differentially private Empirical Risk Minimization (ERM) and Stochastic Convex Optimization (SCO) problems for non-smooth convex functions. We obtain (nearly) optimal bounds on the excess empirical risk and the excess population loss with subquadratic gradient complexity. More precisely, our differentially private algorithm requires O(N^{3/2}/d^{1/8} + N^2/d) gradient queries for optimal excess empirical risk, which is achieved with the help of subsampling and of smoothing the function via convolution. This is the first subquadratic algorithm for the non-smooth case when d is superconstant. As a direct application, using the iterative localization approach of Feldman et al., we achieve the optimal excess population loss for the stochastic convex optimization problem with O(min{N^{5/4} d^{1/8}, N^{3/2}/d^{1/8}}) gradient queries. Our work makes progress towards resolving a question raised by Bassily et al., giving the first algorithms for private ERM and SCO with subquadratic steps. We note that, independently, Asi et al. gave other algorithms for private ERM and SCO with subquadratic steps.
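The smoothing step mentioned above refers to convolving the non-smooth loss with the uniform distribution on a small Euclidean ball, f_beta(x) = E_{u ~ Unif(B^d)}[f(x + beta*u)], whose gradient can be estimated by averaging subgradients at randomly perturbed points. The sketch below is a minimal illustration of that idea combined with subsampled noisy SGD; it is not the paper's actual algorithm. The hinge loss, step counts, batch size, and noise scale sigma are all placeholder choices, and the (eps, delta) privacy calibration and query-count analysis are omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

def hinge_loss_grad(x, a, b):
    """Subgradient of the non-smooth hinge loss max(0, 1 - b * <a, x>)."""
    return -b * a if 1.0 - b * np.dot(a, x) > 0 else np.zeros_like(x)

def sample_unit_ball(d):
    """Uniform sample from the d-dimensional Euclidean unit ball."""
    u = rng.standard_normal(d)
    u /= np.linalg.norm(u)
    return u * rng.uniform() ** (1.0 / d)

def smoothed_grad(x, a, b, beta, m):
    """Monte Carlo estimate of grad f_beta(x) from m perturbed subgradients."""
    return np.mean(
        [hinge_loss_grad(x + beta * sample_unit_ball(len(x)), a, b) for _ in range(m)],
        axis=0,
    )

def private_smoothed_sgd(A, y, beta=0.1, m=8, batch=16, steps=200, lr=0.05, sigma=1.0):
    """Subsampled SGD on the smoothed loss, with Gaussian noise for privacy.

    sigma must be calibrated to the target (eps, delta) via the gradient
    sensitivity; that calibration is deliberately left out of this sketch.
    """
    N, d = A.shape
    x = np.zeros(d)
    for _ in range(steps):
        idx = rng.choice(N, size=batch, replace=False)   # subsampling
        g = np.mean([smoothed_grad(x, A[i], y[i], beta, m) for i in idx], axis=0)
        g += sigma * rng.standard_normal(d) / batch      # Gaussian noise for DP
        x -= lr * g
    return x

# Toy data: labels generated by a random linear separator.
A = rng.standard_normal((256, 5))
w_true = rng.standard_normal(5)
y = np.sign(A @ w_true)
print(private_smoothed_sgd(A, y)[:3])
```

The point of the smoothing is that f_beta is differentiable with Lipschitz gradient even when f is not, so smooth-case private optimizers apply; the subsampling is what drives the gradient-query count below quadratic in N.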



