pMSE Mechanism: Differentially Private Synthetic Data with Maximal Distributional Similarity

05/23/2018
by Joshua Snoke, et al.

We propose a method for the release of differentially private synthetic datasets. In many contexts, data contain sensitive values that cannot be released in their original form without compromising individuals' privacy. Synthetic data is a protection method that releases alternative values in place of the original ones, and differential privacy (DP) is a formal guarantee for quantifying the privacy loss of such a release. Our method maximizes the distributional similarity of the synthetic data relative to the original data, as measured by the pMSE (propensity score mean-squared error), while guaranteeing epsilon-differential privacy. Additionally, we relax common DP assumptions concerning the distribution and boundedness of the original data. We prove theoretical results for the privacy guarantee and provide simulations of the empirical failure rate of those results under typical computational limitations. We also give simulations comparing the accuracy of linear regression coefficients estimated from our synthetic data with the accuracy obtained from non-differentially private synthetic data and from other differentially private methods. Finally, our theoretical results extend a prior sensitivity result for the Gini index to include continuous predictors.
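For readers unfamiliar with the pMSE, the sketch below illustrates how the measure is typically computed: stack the original and synthetic records, fit a classifier that predicts which dataset each record came from, and score how far the fitted propensities deviate from the synthetic-data share. This is only a minimal illustration of the utility measure under stated assumptions (numeric features, a plain logistic-regression discriminator); it is not the authors' differentially private mechanism, and the function and variable names are hypothetical.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression


def pmse(original: pd.DataFrame, synthetic: pd.DataFrame) -> float:
    """Illustrative pMSE: mean squared deviation of estimated propensity
    scores from the synthetic-data proportion c = n_syn / (n_orig + n_syn).
    Assumes both data frames share the same numeric columns."""
    combined = pd.concat([original, synthetic], ignore_index=True)
    labels = np.concatenate([np.zeros(len(original)), np.ones(len(synthetic))])
    c = len(synthetic) / len(combined)  # expected propensity if the two are indistinguishable
    model = LogisticRegression(max_iter=1000).fit(combined, labels)
    propensities = model.predict_proba(combined)[:, 1]
    return float(np.mean((propensities - c) ** 2))
```

A pMSE near zero means the discriminator cannot tell synthetic records from original ones, i.e., high distributional similarity; the paper's mechanism optimizes the synthetic data against this measure while adding the noise needed for the epsilon-DP guarantee.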
