Federated Self-supervised Learning for Heterogeneous Clients

by Disha Makhija, et al.

Federated Learning has become an important learning paradigm due to its privacy and computational benefits. As the field advances, two key challenges remain to be addressed: (1) system heterogeneity, i.e., variability in the compute and/or data resources available on each client, and (2) the lack of labeled data in certain federated settings. Several recent developments have tried to overcome these challenges independently. In this work, we propose a unified and systematic framework, Heterogeneous Self-supervised Federated Learning (Hetero-SSFL), for enabling federated self-supervised learning on heterogeneous clients. The proposed framework allows collaborative representation learning across all clients without imposing architectural constraints or requiring the presence of labeled data. The key idea in Hetero-SSFL is to let each client train its own self-supervised model and to enable joint learning across clients by aligning the lower-dimensional representations on a common dataset. The entire training procedure can be viewed as self- and peer-supervised, since neither the local training nor the alignment procedure requires labeled data. As in conventional self-supervised learning, the resulting client models are task-independent and can be used for varied end-tasks. We provide a convergence guarantee for the proposed framework with non-convex objectives in heterogeneous settings and empirically demonstrate that our approach outperforms state-of-the-art methods by a significant margin.
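To make the alignment idea concrete, the following is a minimal sketch (not the paper's exact objective; the function name, cosine-distance choice, and averaging of peer representations are illustrative assumptions) of how a client could align its representations with those of its peers on a shared unlabeled dataset:

```python
import numpy as np

def alignment_loss(z_local, z_peers):
    """Hypothetical sketch of Hetero-SSFL-style alignment.

    z_local: (batch, dim) representations this client computed on the
             common unlabeled dataset.
    z_peers: (num_peers, batch, dim) precomputed peer representations
             on the same common batch.
    Returns a scalar cosine-distance loss encouraging agreement with
    the average peer representation. No labels are needed anywhere.
    """
    def normalize(x):
        # L2-normalize along the feature dimension.
        return x / np.linalg.norm(x, axis=-1, keepdims=True)

    z_l = normalize(z_local)
    # Average the normalized peer representations, then renormalize
    # to obtain a unit-norm alignment target per example.
    z_t = normalize(normalize(z_peers).mean(axis=0))
    # Cosine distance 1 - cos(z_l, z_t), averaged over the batch.
    return float(np.mean(1.0 - np.sum(z_l * z_t, axis=-1)))
```

Because only low-dimensional representations on the common dataset are exchanged, clients with entirely different architectures (and embedding computations) can still participate, which is what removes the usual architectural constraints.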




