Training Differentially Private Graph Neural Networks with Random Walk Sampling

01/02/2023, by Morgane Ayle, et al.

Deep learning models are known to put the privacy of their training data at risk, which poses challenges for their safe and ethical release to the public. Differentially private stochastic gradient descent is the de facto standard for training neural networks without leaking sensitive information about the training data. However, applying it to models for graph-structured data poses a novel challenge: unlike with i.i.d. data, sensitive information about a node in a graph can leak not only through its own gradients, but also through the gradients of all nodes within a larger neighborhood. In practice, this limits privacy-preserving deep learning on graphs to very shallow graph neural networks. We propose to solve this issue by training graph neural networks on disjoint subgraphs of a given training graph. We develop three random-walk-based methods for generating such disjoint subgraphs and perform a careful analysis of the data-generating distributions to provide strong privacy guarantees. Through extensive experiments, we show that our method greatly outperforms the state-of-the-art baseline on three large graphs, and matches or outperforms it on four smaller ones.
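The core idea of sampling disjoint random-walk subgraphs can be illustrated with a minimal sketch. This is not the authors' implementation: the function name, the adjacency-dict graph representation, and the strategy of restricting each walk to nodes not yet used by any earlier walk are assumptions made here for illustration. Because every node is assigned to at most one walk, the resulting subgraphs are pairwise disjoint, which is the property the abstract relies on for its privacy analysis.

```python
import random

def disjoint_random_walk_subgraphs(adj, walk_length, seed=0):
    """Partition a graph's nodes into disjoint subgraphs via random walks.

    adj: dict mapping each node to a list of its neighbours.
    walk_length: maximum number of nodes per walk.

    Hypothetical sketch: each walk starts at a not-yet-used node and
    may only step to not-yet-used neighbours, so no node appears in
    more than one subgraph (walks that get stuck end early).
    """
    rng = random.Random(seed)
    used = set()          # nodes already assigned to some subgraph
    subgraphs = []
    nodes = list(adj)
    rng.shuffle(nodes)    # randomize walk starting points
    for start in nodes:
        if start in used:
            continue
        walk = [start]
        used.add(start)
        current = start
        for _ in range(walk_length - 1):
            # only unused neighbours keep the subgraphs disjoint
            candidates = [v for v in adj[current] if v not in used]
            if not candidates:
                break     # walk is stuck; emit a shorter subgraph
            current = rng.choice(candidates)
            walk.append(current)
            used.add(current)
        subgraphs.append(walk)
    return subgraphs
```

Every node ends up in exactly one subgraph, so the union of the walks covers the graph and the walks never overlap; a DP-SGD-style training loop could then treat each subgraph as one training example.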


