Federated Graph Representation Learning using Self-Supervision

by Susheel Suresh, et al.

Federated graph representation learning (FedGRL) brings the benefits of distributed training to graph-structured data while simultaneously addressing some privacy and compliance concerns related to data curation. However, several interesting real-world graph data characteristics, viz. label deficiency and downstream task heterogeneity, are not taken into consideration in current FedGRL setups. In this paper, we consider a realistic and novel problem setting wherein cross-silo clients have access to vast amounts of unlabeled data with limited or no labeled data, and additionally have diverse downstream class label domains. We then propose a novel FedGRL formulation based on model interpolation, where we aim to learn a shared global model that is optimized collaboratively using a self-supervised objective and receives downstream task supervision through local client models. We provide a specific instantiation of our general formulation using BGRL, a state-of-the-art self-supervised graph representation learning method, and we empirically verify its effectiveness on realistic cross-silo datasets: (1) we adapt the Twitch Gamer Network, which naturally simulates a cross-geo scenario, and show that our formulation provides consistent gains of 6.1% on avg. over traditional supervised federated learning objectives and 1.7% on avg. over individual self-supervised training; and (2) we construct and introduce a new cross-silo dataset called Amazon Co-purchase Networks, which exhibits both characteristics of the motivating problem setting, and witness gains of 11.5% on avg. over traditional supervised federated learning and 1.9% on avg. over individually trained self-supervised models. Both experimental results point to the effectiveness of our proposed formulation. Finally, both our novel problem setting and dataset contributions provide new avenues for research in FedGRL.
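The formulation described above (a shared global model trained collaboratively with a self-supervised objective, local per-client models carrying downstream supervision, and the two combined via model interpolation) can be sketched as a toy federated loop. This is a minimal sketch under loud assumptions: it is not the paper's implementation, models are plain parameter vectors rather than graph encoders, and the self-supervised and supervised losses are stand-in quadratics; `ALPHA`, `ssl_target`, and `sup_target` are all hypothetical names introduced for illustration.

```python
import numpy as np

# Toy sketch of model-interpolation FedGRL (illustrative only, NOT the paper's code).
# Each client trains a copy of the shared global model with a (stand-in)
# self-supervised loss on its unlabeled data, while its local model, an
# interpolation of global and local parameters, receives (stand-in)
# supervised updates from its own label domain.

rng = np.random.default_rng(0)
DIM, CLIENTS, ROUNDS, LR = 8, 4, 20, 0.1
ALPHA = 0.5  # interpolation weight between global and local parameters

# Stand-in "data": each client's unlabeled data defines a self-supervised
# optimum ssl_target[i]; its labeled data defines a supervised optimum sup_target[i].
ssl_target = rng.normal(size=(CLIENTS, DIM))
sup_target = rng.normal(size=(CLIENTS, DIM))

global_w = np.zeros(DIM)
local_w = np.zeros((CLIENTS, DIM))

for _ in range(ROUNDS):
    updates = []
    for i in range(CLIENTS):
        # Local training of the shared model with the toy self-supervised loss
        w = global_w.copy()
        for _ in range(5):
            w -= LR * (w - ssl_target[i])  # grad of 0.5 * ||w - ssl_target[i]||^2
        updates.append(w)
        # Interpolated model gets downstream (toy) supervised updates locally
        mixed = ALPHA * global_w + (1 - ALPHA) * local_w[i]
        local_w[i] = mixed - LR * (mixed - sup_target[i])
    global_w = np.mean(updates, axis=0)  # FedAvg-style aggregation

# The global model drifts toward the average self-supervised optimum
print(np.linalg.norm(global_w - ssl_target.mean(axis=0)))
```

In this sketch `ALPHA` trades off how much each client leans on the collaboratively trained global parameters versus its own task-specific parameters; the point is only to show the structure of the formulation, not its graph-specific (BGRL) instantiation.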


Federated Self-supervised Learning for Heterogeneous Clients

Federated Learning has become an important learning paradigm due to its ...

Federated Self-Supervised Contrastive Learning via Ensemble Similarity Distillation

This paper investigates the feasibility of learning good representation ...

FedCoCo: A Memory Efficient Federated Self-supervised Framework for On-Device Visual Representation Learning

The ubiquity of edge devices has led to a growing amount of unlabeled da...

FedGL: Federated Graph Learning Framework with Global Self-Supervision

Graph data are ubiquitous in the real world. Graph learning (GL) tries t...

Towards Federated Learning Under Resource Constraints via Layer-wise Training and Depth Dropout

Large machine learning models trained on diverse data have recently seen...

SelfFed: Self-supervised Federated Learning for Data Heterogeneity and Label Scarcity in IoMT

Self-supervised learning in federated learning paradigm has been gaining...

Bootstrap Representation Learning for Segmentation on Medical Volumes and Sequences

In this work, we propose a novel straightforward method for medical volu...
