That which we call private

08/08/2019
by Úlfar Erlingsson, et al.

A casual reader of the study by Jayaraman and Evans in USENIX Security 2019 might conclude that "relaxed definitions of differential privacy" should be avoided because they "increase the measured privacy leakage." This note clarifies that their study is consistent with a different interpretation: namely, that the "relaxed definitions" are strict improvements that can tighten the epsilon upper-bound guarantees by orders of magnitude without changing the actual privacy loss. Practitioners should be careful not to equate real-world privacy with epsilon values without considering their context.
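To make that distinction concrete, below is a minimal, hypothetical sketch (not code from the paper) of the effect the abstract describes: the same Gaussian mechanism, applied the same number of times with the same noise, receives a far smaller proven epsilon under Rényi differential privacy accounting (one of the "relaxed definitions") than under basic composition. The formulas are the standard ones from the differential-privacy literature (the classical Gaussian-mechanism analysis, additive RDP composition, and the RDP-to-(epsilon, delta) conversion); the parameter values sigma, T, and delta are arbitrary choices for illustration.

```python
# Hypothetical illustration (not from the paper): compare a naive
# (epsilon, delta) composition bound against a Renyi-DP-based bound
# for T releases of the Gaussian mechanism with sensitivity 1 and
# noise standard deviation sigma. Formulas are standard (Dwork & Roth;
# Mironov 2017); the parameter values below are arbitrary.
import math

sigma, T, delta = 5.0, 1000, 1e-5

# Per-release epsilon of the Gaussian mechanism (classical analysis,
# valid for eps < 1), then basic composition: epsilons add linearly.
eps_single = math.sqrt(2 * math.log(1.25 / delta)) / sigma
eps_basic = T * eps_single

# Renyi DP accounting: the Gaussian mechanism is
# (alpha, alpha / (2 sigma^2))-RDP, RDP composes additively, and
# converts back to (epsilon, delta)-DP via
# eps = rdp_total + log(1/delta) / (alpha - 1); minimize over alpha.
def eps_from_rdp(alpha):
    rdp_total = T * alpha / (2 * sigma**2)
    return rdp_total + math.log(1 / delta) / (alpha - 1)

eps_rdp = min(eps_from_rdp(1 + x / 100) for x in range(1, 10000))

print(f"basic composition: eps ~ {eps_basic:.1f}")
print(f"RDP accounting:    eps ~ {eps_rdp:.1f}")
# Same mechanism, same noise, same actual privacy loss -- only the
# proven upper bound on epsilon differs.
```

With these (arbitrary) parameters the sketch prints roughly eps ~ 969 for basic composition versus roughly eps ~ 50 for RDP accounting. The mechanism, and hence the real-world privacy loss, is identical in both cases; only the proven upper bound changes, which is exactly the note's point.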

research · 02/24/2019
When Relaxations Go Bad: "Differentially-Private" Machine Learning
Differential privacy is becoming a standard notion for performing privac...

research · 07/12/2020
A Graph Symmetrisation Bound on Channel Information Leakage under Blowfish Privacy
Blowfish privacy is a recent generalisation of differential privacy that...

research · 02/17/2022
Local Differential Privacy for Belief Functions
In this paper, we propose two new definitions of local differential priv...

research · 10/19/2020
On Properties and Optimization of Information-theoretic Privacy Watchdog
We study the problem of privacy preservation in data sharing, where S is...

research · 02/27/2023
On Differentially Private Online Predictions
In this work we introduce an interactive variant of joint differential p...

research · 03/18/2021
Super-convergence and Differential Privacy: Training faster with better privacy guarantees
The combination of deep neural networks and Differential Privacy has bee...

research · 05/15/2018
How Private Is Your Voting? A Framework for Comparing the Privacy of Voting Mechanisms
Voting privacy has received a lot of attention across several research c...