Optimal Local Bayesian Differential Privacy over Markov Chains

06/22/2022
by   Darshan Chakrabarti, et al.

Differential privacy is the most widely used model in the data-privacy literature. An algorithm is differentially private if its outputs with and without any individual's data are indistinguishable. In this paper, we focus on data generated from a Markov chain and argue that Bayesian differential privacy (BDP) offers more meaningful guarantees in this setting. Our main theoretical contribution is a mechanism that achieves BDP when data is drawn from a binary Markov chain. We improve on the state-of-the-art BDP mechanism and show that our mechanism provides optimal noise-privacy tradeoffs among local mechanisms, up to negligible factors. We also briefly discuss a non-local mechanism that adds correlated noise. Lastly, we perform experiments on synthetic data that illustrate when DP is insufficient, and experiments on real data showing that our privacy guarantees are robust to underlying distributions that are not simple Markov chains.
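As a point of reference for the setting the abstract describes, the following is a minimal sketch of local privatization of a correlated binary stream: data is sampled from a symmetric two-state Markov chain and each bit is then perturbed independently with standard randomized response. This is only an illustrative baseline, not the paper's optimal BDP mechanism; the function names and the parameters p_stay and epsilon are assumptions chosen for the example.

```python
import numpy as np

def sample_binary_markov_chain(n, p_stay=0.9, rng=None):
    """Sample a length-n {0,1} sequence from a symmetric two-state Markov chain.

    p_stay is the probability of remaining in the current state
    (an illustrative parameter, not taken from the paper).
    """
    rng = np.random.default_rng(rng)
    x = np.empty(n, dtype=int)
    x[0] = rng.integers(2)
    for t in range(1, n):
        stay = rng.random() < p_stay
        x[t] = x[t - 1] if stay else 1 - x[t - 1]
    return x

def randomized_response(x, epsilon, rng=None):
    """Perturb each bit independently with epsilon-LDP randomized response.

    Each bit is reported truthfully with probability e^eps / (1 + e^eps),
    which satisfies epsilon-local differential privacy per symbol.
    """
    rng = np.random.default_rng(rng)
    p_truth = np.exp(epsilon) / (1.0 + np.exp(epsilon))
    flips = rng.random(len(x)) >= p_truth
    return np.where(flips, 1 - x, x)

# Example: privatize a correlated binary stream.
x = sample_binary_markov_chain(1000, p_stay=0.95)
z = randomized_response(x, epsilon=1.0)
print("empirical flip rate:", np.mean(x != z))
```

The point of the sketch is that each symbol is noised in isolation (a local mechanism), even though the underlying bits are correlated; BDP is the lens the paper uses to account for that correlation when quantifying privacy.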
