Differentially-Private Hierarchical Clustering with Provable Approximation Guarantees

01/31/2023
by   Jacob Imola, et al.

Hierarchical clustering is a popular unsupervised machine learning method with decades of history and numerous applications. We initiate the study of differentially private approximation algorithms for hierarchical clustering under the rigorous cost framework introduced by Dasgupta (2016). We show a strong lower bound for the problem: any ϵ-DP algorithm must incur Ω(|V|^2/ϵ) additive error on an input dataset V. We then exhibit a polynomial-time approximation algorithm with O(|V|^2.5/ϵ) additive error, and an exponential-time algorithm that matches the lower bound. To circumvent the lower bound, we focus on the stochastic block model, a popular model of graphs, and, under a separation assumption on the blocks, propose a private (1+o(1))-approximation algorithm that also recovers the blocks exactly. Finally, we perform an empirical study of our algorithms and validate their performance.
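To make the objective concrete, the following is a minimal sketch of Dasgupta's cost function for a hierarchical clustering: each weighted edge (i, j) is charged its weight times the number of leaves under the lowest tree node separating i from j. The edge-list and nested-tuple tree representations here are illustrative assumptions, not the paper's implementation.

```python
def leaves(tree):
    """Return the set of leaf labels of a nested-tuple tree."""
    if not isinstance(tree, tuple):
        return {tree}
    return set().union(*(leaves(child) for child in tree))

def dasgupta_cost(tree, edges):
    """Dasgupta (2016) cost: sum over edges (i, j, w) of
    w * |leaves(T[i ∨ j])|, where T[i ∨ j] is the subtree rooted
    at the lowest common ancestor of i and j."""
    if not isinstance(tree, tuple):
        return 0  # a single leaf separates nothing
    here = leaves(tree)
    child_sets = [leaves(child) for child in tree]
    cost = 0
    for i, j, w in edges:
        if i not in here or j not in here:
            continue  # edge lives outside this subtree
        if any(i in s and j in s for s in child_sets):
            continue  # LCA is deeper; charged in a recursive call
        cost += w * len(here)  # this node is the LCA of i and j
    return cost + sum(dasgupta_cost(child, edges) for child in tree)
```

For the unit-weight triangle on {a, b, c} with tree ((a, b), c), the edge (a, b) is charged 2 and the other two edges 3 each, for a total cost of 8.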

