Auditing Differential Privacy in High Dimensions with the Kernel Quantum Rényi Divergence

05/27/2022
by Carles Domingo-Enrich, et al.

Differential privacy (DP) is the de facto standard for private data release and private machine learning. Auditing black-box DP algorithms and mechanisms to certify whether they satisfy a certain DP guarantee is challenging, especially in high dimension. We propose relaxations of differential privacy based on new divergences on probability distributions: the kernel Rényi divergence and its regularized version. We show that the regularized kernel Rényi divergence can be estimated from samples even in high dimensions, giving rise to auditing procedures for ε-DP, (ε,δ)-DP and (α,ε)-Rényi DP.
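The sample-based estimation idea can be illustrated with a simplified, hypothetical sketch — this is not the paper's actual estimator. It approximates an RBF kernel with random Fourier features, forms empirical kernel covariance matrices from samples of a mechanism run on two neighboring datasets, and evaluates a regularized sandwiched quantum Rényi divergence between them; an (α,ε)-Rényi DP audit would flag a violation if the estimate exceeds ε. All function names, the feature-map choice, and the regularization scheme below are illustrative assumptions.

```python
import numpy as np

def random_fourier_features(X, W, b):
    """Random Fourier feature map approximating an RBF kernel:
    k(x, y) ~ phi(x) . phi(y) with phi(x) = sqrt(2/D) cos(Wx + b)."""
    D = W.shape[0]
    return np.sqrt(2.0 / D) * np.cos(X @ W.T + b)

def kernel_covariance(X, W, b):
    """Empirical (uncentered) covariance of the feature embeddings,
    a finite-dimensional surrogate for the kernel covariance operator."""
    Phi = random_fourier_features(X, W, b)
    return Phi.T @ Phi / Phi.shape[0]

def regularized_renyi(A, B, alpha=2.0, gamma=1e-3):
    """Sandwiched quantum Renyi divergence between regularized,
    trace-normalized PSD matrices:
    D_a(A || B) = 1/(a-1) * log tr[(B^{(1-a)/2a} A B^{(1-a)/2a})^a]."""
    D = A.shape[0]
    A = A + gamma * np.eye(D)          # regularize (keeps spectra bounded away from 0)
    B = B + gamma * np.eye(D)
    A = A / np.trace(A)                # normalize so A, B act like density matrices
    B = B / np.trace(B)
    w, V = np.linalg.eigh(B)
    Bp = V @ np.diag(w ** ((1.0 - alpha) / (2.0 * alpha))) @ V.T
    m = np.linalg.eigvalsh(Bp @ A @ Bp)
    m = np.clip(m, 1e-12, None)        # guard against tiny negative round-off
    return np.log(np.sum(m ** alpha)) / (alpha - 1.0)
```

An auditor would draw output samples from the black-box mechanism on each of two neighboring datasets, compute the estimate, and account for sampling error before declaring that a claimed (α,ε) guarantee is violated.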
