Empirical Differential Privacy

10/28/2019
by Paul Burchard, et al.

We show how to achieve differential privacy with reduced added noise, or none at all, by exploiting the empirical noise already present in the data itself. Unlike previous work on noiseless privacy, the empirical viewpoint avoids making explicit assumptions about the random process generating the data.
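The abstract does not spell out the construction, so the following Python sketch is only a hypothetical reading of the general idea, not the paper's method: estimate how much a query's output already fluctuates due to randomness in the data (here via a simple bootstrap, which is our assumption), and add only whatever Laplace noise is still needed to reach the scale the standard mechanism would use. The function names and the "top-up" heuristic are illustrative, and this sketch does not by itself establish a formal differential privacy guarantee.

```python
import numpy as np

def empirical_noise_scale(data, query, rng, n_resamples=200):
    # Estimate how much the query output already fluctuates, here via a
    # simple bootstrap over the records (a stand-in for "empirical noise";
    # the paper's actual estimator may differ).
    estimates = [query(rng.choice(data, size=len(data), replace=True))
                 for _ in range(n_resamples)]
    return float(np.std(estimates))

def release(data, query, sensitivity, epsilon, rng):
    # Scale of Laplace noise the standard mechanism would add.
    target_scale = sensitivity / epsilon
    # Subtract the noise the data already carries, so the noise actually
    # added is reduced and may drop to zero entirely.
    inherent = empirical_noise_scale(data, query, rng)
    extra_scale = max(target_scale - inherent, 0.0)
    return query(data) + rng.laplace(scale=extra_scale)

rng = np.random.default_rng(seed=0)
data = rng.normal(loc=0.0, scale=1.0, size=1000)
# Mean query over n records has global sensitivity ~ range/n; assume range 1.
print(release(data, np.mean, sensitivity=1.0 / len(data), epsilon=0.5, rng=rng))
```

In this reading, data whose inherent variability already matches the target noise scale would be released with no added noise at all, which is the "no or reduced added noise" behavior the abstract describes.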


