Local Differential Privacy Is Equivalent to Contraction of E_γ-Divergence

02/02/2021
by Shahab Asoodeh, et al.

We investigate the local differential privacy (LDP) guarantees of a randomized privacy mechanism via its contraction properties. We first show that LDP constraints can be equivalently cast in terms of the contraction coefficient of the E_γ-divergence. We then use this equivalent formulation to express the LDP guarantees of privacy mechanisms in terms of contraction coefficients of arbitrary f-divergences. When combined with standard estimation-theoretic tools (such as Le Cam's and Fano's converse methods), this result allows us to study the trade-off between privacy and utility in several hypothesis testing problems and in minimax and Bayesian estimation problems.
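For orientation, here is a rough sketch of the quantities behind the first claim, using the standard definition of the E_γ (hockey-stick) divergence; the exact statement, conditions, and proofs are in the paper.

```latex
% E_gamma (hockey-stick) divergence between distributions P and Q, for gamma >= 1:
\[
  \mathsf{E}_{\gamma}(P \,\|\, Q) \;=\; \sup_{A}\bigl[\,P(A) - \gamma\,Q(A)\,\bigr].
\]
% Contraction coefficient of a mechanism (Markov kernel) K under a divergence D:
% the worst-case factor by which K shrinks D, the supremum running over input
% distributions P, Q with D(P || Q) > 0.
\[
  \eta_{D}(K) \;=\; \sup_{P,\,Q}\,
    \frac{D(PK \,\|\, QK)}{D(P \,\|\, Q)} .
\]
% Claimed equivalence (informally): K satisfies (epsilon, delta)-LDP if and only
% if the E_gamma contraction coefficient at gamma = e^{epsilon} is at most delta;
% pure epsilon-LDP corresponds to delta = 0.
\[
  K \ \text{is}\ (\varepsilon,\delta)\text{-LDP}
  \quad\Longleftrightarrow\quad
  \eta_{\mathsf{E}_{e^{\varepsilon}}}(K) \;\le\; \delta .
\]
```

The second claim in the abstract then relates this E_γ contraction coefficient to the contraction coefficients of arbitrary f-divergences, which are exactly the quantities that enter Le Cam's and Fano's converse arguments for the testing and estimation problems studied in the paper.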

Related research

12/20/2020
Privacy Analysis of Online Learning Algorithms via Contraction Coefficients
We propose an information-theoretic technique for analyzing privacy guar...

01/17/2020
Privacy Amplification of Iterative Algorithms via Contraction Coefficients
We investigate the framework of privacy amplification by iteration, rece...

01/18/2018
On the Contractivity of Privacy Mechanisms
We present a novel way to compare the statistical cost of privacy mechan...

09/12/2017
Observational Equivalence in System Estimation: Contractions in Complex Networks
Observability of complex systems/networks is the focus of this paper, wh...

10/24/2022
Contraction of Locally Differentially Private Mechanisms
We investigate the contraction properties of locally differentially priv...

05/24/2019
Minimax Rates of Estimating Approximate Differential Privacy
Differential privacy has become a widely accepted notion of privacy, lea...

07/13/2019
Local Distribution Obfuscation via Probability Coupling
We introduce a general model for the local obfuscation of probability di...