Auditing Differential Privacy in High Dimensions with the Kernel Quantum Rényi Divergence
Differential privacy (DP) is the de facto standard for private data release and private machine learning. Auditing black-box DP algorithms and mechanisms to certify whether they satisfy a given DP guarantee is challenging, especially in high dimensions. We propose relaxations of differential privacy based on new divergences between probability distributions: the kernel Rényi divergence and its regularized version. We show that the regularized kernel Rényi divergence can be estimated from samples even in high dimensions, giving rise to auditing procedures for ε-DP, (ε,δ)-DP, and (α,ε)-Rényi DP.
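The sketch below illustrates the general idea of estimating a regularized quantum Rényi divergence between kernel covariance operators from samples; it is not the paper's estimator. The function names, the use of random Fourier features to get an explicit finite-dimensional feature map, the Petz form of the quantum Rényi divergence, and the trace normalization are all assumptions made for illustration, and the paper's exact definition and sample-based estimator (e.g., via Gram matrices) may differ.

```python
import numpy as np
from scipy.linalg import fractional_matrix_power


def rff_features(X, W, b):
    """Random Fourier features approximating a Gaussian kernel (assumption)."""
    return np.sqrt(2.0 / W.shape[0]) * np.cos(X @ W.T + b)


def empirical_cov(Phi):
    """Empirical (uncentered) covariance operator in the explicit feature space."""
    return Phi.T @ Phi / Phi.shape[0]


def petz_renyi(rho, sigma, alpha):
    """Petz quantum Rényi divergence D_alpha(rho || sigma) between density matrices:
    D_alpha = 1/(alpha - 1) * log Tr(rho^alpha sigma^(1 - alpha))."""
    r_a = fractional_matrix_power(rho, alpha)
    s_1a = fractional_matrix_power(sigma, 1.0 - alpha)
    return float(np.log(np.trace(r_a @ s_1a).real) / (alpha - 1.0))


def regularized_kernel_renyi(X, Y, alpha=2.0, lam=1e-3, n_features=256,
                             gamma=1.0, seed=0):
    """Hypothetical sample-based estimate of a regularized kernel Rényi divergence
    between the distributions of X and Y (arrays of shape (n, d))."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    # Spectral sampling for the Gaussian kernel k(x, y) = exp(-gamma * ||x - y||^2)
    W = rng.normal(scale=np.sqrt(2.0 * gamma), size=(n_features, d))
    b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)

    Sx = empirical_cov(rff_features(X, W, b))
    Sy = empirical_cov(rff_features(Y, W, b))

    # Regularization lam * I keeps the fractional matrix powers well defined.
    I = np.eye(n_features)
    rho = Sx + lam * I
    sigma = Sy + lam * I
    # Normalize to trace one so the operators are density matrices (a convention
    # chosen here for illustration; the paper's normalization may differ).
    rho /= np.trace(rho)
    sigma /= np.trace(sigma)
    return petz_renyi(rho, sigma, alpha)


if __name__ == "__main__":
    # Toy check: samples from two Gaussians with slightly different means,
    # mimicking outputs of a mechanism on neighboring datasets.
    rng = np.random.default_rng(1)
    X = rng.normal(0.0, 1.0, size=(2000, 10))
    Y = rng.normal(0.2, 1.0, size=(2000, 10))
    print(regularized_kernel_renyi(X, Y, alpha=2.0, lam=1e-2))
```

In an auditing setting, X and Y would be samples of a mechanism's output on two neighboring datasets, and the estimated divergence would be compared against the bound implied by the claimed DP guarantee.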