MURS: Practical and Robust Privacy Amplification with Multi-Party Differential Privacy
When collecting information, local differential privacy (LDP) alleviates users' privacy concerns because each user's private value is randomized before being sent to the central aggregator. However, LDP incurs a loss of utility because of the amount of noise added to each individual data item. To address this issue, recent work introduced an intermediate server under the assumption that this server does not collude with the aggregator. Under this trust model, one can add less noise to achieve the same privacy guarantee, thus improving utility. In this paper, we investigate this multi-party setting of LDP. We first analyze the threat model and identify potential adversaries. We then make observations about existing approaches and propose new techniques that achieve a better privacy-utility tradeoff. Finally, we perform experiments to compare the different methods and demonstrate the benefits of our proposed method.
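To make the utility loss mentioned above concrete, the following is a minimal sketch of standard binary randomized response (the classic epsilon-LDP mechanism), not the multi-party protocol proposed in the paper; the function names and the choice of epsilon are illustrative assumptions.

```python
import math
import random

def randomized_response(true_bit: bool, epsilon: float) -> bool:
    # Report the true bit with probability e^eps / (e^eps + 1),
    # otherwise report its flip; this satisfies epsilon-LDP.
    p_truth = math.exp(epsilon) / (math.exp(epsilon) + 1.0)
    return true_bit if random.random() < p_truth else not true_bit

def estimate_frequency(reports, epsilon: float) -> float:
    # Unbiased estimate of the fraction of true bits from noisy reports:
    # E[observed] = p*f + (1 - p)*(1 - f), solved for f.
    p = math.exp(epsilon) / (math.exp(epsilon) + 1.0)
    observed = sum(reports) / len(reports)
    return (observed + p - 1.0) / (2.0 * p - 1.0)

if __name__ == "__main__":
    random.seed(0)
    epsilon = 1.0
    true_bits = [random.random() < 0.3 for _ in range(100_000)]
    reports = [randomized_response(b, epsilon) for b in true_bits]
    print(f"true fraction      : {sum(true_bits) / len(true_bits):.3f}")
    print(f"estimated fraction : {estimate_frequency(reports, epsilon):.3f}")
```

The gap between the true and estimated fractions grows as epsilon shrinks, which is the utility loss the intermediate-server trust model aims to reduce by allowing less per-user noise for the same end-to-end guarantee.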