Preserving Differential Privacy Between Features in Distributed Estimation

03/01/2017
by   Christina Heinze-Deml, et al.

Privacy is crucial in many applications of machine learning. Legal, ethical and societal issues restrict the sharing of sensitive data, making it difficult to learn from datasets that are partitioned between many parties. One important instance of such a distributed setting arises when information about each record in the dataset is held by different data owners, i.e. the design matrix is "vertically partitioned". In this setting, few approaches exist for private data sharing for the purposes of statistical estimation, and the classical setup of differential privacy with a "trusted curator" preparing the data does not apply. We work with the notion of (ϵ,δ)-distributed differential privacy, which extends single-party differential privacy to the distributed, vertically partitioned case. We propose PriDE, a scalable framework for distributed estimation in which each party communicates perturbed random projections of its locally held features, ensuring that (ϵ,δ)-distributed differential privacy is preserved. For ℓ_2-penalized supervised learning problems, PriDE has bounded estimation error compared with the optimal estimates obtained without privacy constraints in the non-distributed setting. We confirm this empirically on real-world and synthetic datasets.
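The core local step described in the abstract, a party releasing a perturbed random projection of its own feature block, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function name, the Gaussian random projection, and the standard Gaussian-mechanism noise calibration (assuming unit-normalized columns bound the ℓ_2-sensitivity) are assumptions for the example, not details taken from the paper.

```python
import numpy as np

def perturbed_projection(X_local, k, epsilon, delta, seed=None):
    """Hypothetical sketch of one party's release in a vertically
    partitioned setting.

    X_local : (n, d_local) block of features held by this party.
    k       : target projection dimension (k < n).
    Returns a (k, d_local) noisy sketch.
    """
    rng = np.random.default_rng(seed)
    n, d_local = X_local.shape
    # Normalize each column to unit l2 norm so the sensitivity of the
    # projection is bounded (an assumption made for this sketch).
    norms = np.linalg.norm(X_local, axis=0)
    norms[norms == 0] = 1.0
    X_norm = X_local / norms
    # Gaussian random projection: entries N(0, 1/k) preserve inner
    # products in expectation (Johnson-Lindenstrauss style).
    Pi = rng.normal(scale=1.0 / np.sqrt(k), size=(k, n))
    sketch = Pi @ X_norm
    # Gaussian-mechanism noise for (epsilon, delta)-DP, calibrated to
    # sensitivity 1 (standard textbook scale, not the paper's exact one).
    sigma = np.sqrt(2.0 * np.log(1.25 / delta)) / epsilon
    return sketch + rng.normal(scale=sigma, size=sketch.shape)
```

Each party would run this locally and share only the noisy sketch; a coordinator could then fit an ℓ_2-penalized model on the concatenated sketches without ever seeing raw features.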

Related research:

- Achieving Differential Privacy in Vertically Partitioned Multiparty Learning (11/11/2019)
- Frequency Estimation Under Multiparty Differential Privacy: One-shot and Streaming (04/05/2021)
- DuetSGX: Differential Privacy with Secure Hardware (10/20/2020)
- On the Complexity of Two-Party Differential Privacy (08/17/2021)
- Privacy-Preserving Generalized Linear Models using Distributed Block Coordinate Descent (11/08/2019)
- Weighted Distributed Differential Privacy ERM: Convex and Non-convex (10/23/2019)
- Adaptive Statistical Learning with Bayesian Differential Privacy (11/02/2019)
