FSL: Federated Supermask Learning

10/08/2021
by   Hamid Mozaffari, et al.

Federated learning (FL) allows multiple clients with (private) data to collaboratively train a common machine learning model without sharing their private training data. In-the-wild deployment of FL faces two major hurdles: robustness to poisoning attacks and communication efficiency. To address these concurrently, we propose Federated Supermask Learning (FSL). The FSL server trains a global subnetwork within a randomly initialized neural network by aggregating the local subnetworks of all collaborating clients. FSL clients share their local subnetworks in the form of rankings of network edges; more useful edges have higher ranks. By sharing integer rankings instead of float weights, FSL restricts the space available to craft effective poisoning updates, and by sharing subnetworks, FSL reduces the communication cost of training. We show theoretically and empirically that FSL is robust by design and also significantly communication efficient; all this without compromising clients' privacy. Our experiments demonstrate the superiority of FSL in real-world FL settings; in particular, (1) FSL achieves performance similar to state-of-the-art FedAvg with significantly lower communication costs: for CIFAR10, FSL achieves the same performance as Federated Averaging while reducing communication cost by about 35%, and (2) FSL is substantially more robust to poisoning attacks than state-of-the-art robust aggregation algorithms. We have released the code for reproducibility.
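The abstract's core mechanism — clients submit integer rankings of edges and the server aggregates them into a global supermask — can be sketched roughly as follows. This is an illustrative reconstruction, not the paper's exact algorithm or API: the function name, the sum-then-top-k aggregation rule, and the toy data are assumptions for exposition.

```python
import numpy as np

def aggregate_rankings(client_rankings, k):
    """Hypothetical sketch of FSL-style server aggregation: each client
    submits an integer ranking of the network's edges (higher rank = more
    useful edge). The server sums the per-client ranks and keeps the top-k
    edges as the global supermask. Details are illustrative only."""
    total = np.sum(client_rankings, axis=0)   # element-wise sum of ranks
    mask = np.zeros_like(total)
    top_k = np.argsort(total)[-k:]            # indices of the k highest-ranked edges
    mask[top_k] = 1                           # supermask selects those edges
    return mask

# Example: 3 clients each ranking 5 edges (0 = least useful, 4 = most useful)
rankings = np.array([
    [4, 3, 0, 1, 2],
    [3, 4, 1, 0, 2],
    [4, 2, 0, 3, 1],
])
print(aggregate_rankings(rankings, k=2))  # edges 0 and 1 win the vote
```

Because each client's contribution is a bounded integer permutation rather than unbounded float weights, a single malicious client can shift the aggregate rank of any edge only by a limited amount — which is the intuition behind the abstract's "robust by design" claim.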


