FedHeN: Federated Learning in Heterogeneous Networks

07/07/2022
by   Durmus Alp Emre Acar, et al.

We propose a novel training recipe for federated learning over heterogeneous networks, where each device can have a different architecture. We introduce a side objective for the higher-complexity devices so that the different architectures can be trained jointly in a federated setting. We empirically show that our approach improves the performance of each architecture and yields substantial communication savings compared to state-of-the-art methods.
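The abstract describes jointly training clients with different model sizes, where larger clients carry an extra side objective and the server aggregates only what the architectures share. The following is a minimal, hypothetical sketch of that idea; the names (`local_update`, `federated_round`, `side_grad_fn`, `shared_dim`) and the prefix-sharing scheme are illustrative assumptions, not the paper's actual method.

```python
def local_update(weights, grad_fn, lr=0.1, side_grad_fn=None):
    """One local SGD step; higher-complexity clients also apply a side objective.

    `grad_fn` returns the gradient of the main task loss; `side_grad_fn`
    (hypothetical) returns the gradient of the auxiliary side objective and
    is only set for the larger architectures.
    """
    g = grad_fn(weights)
    if side_grad_fn is not None:  # side objective for higher-complexity devices
        sg = side_grad_fn(weights)
        g = [gi + si for gi, si in zip(g, sg)]
    return [w - lr * gi for w, gi in zip(weights, g)]


def federated_round(clients, shared_dim):
    """One communication round: average only the weights all architectures share.

    Here each client's first `shared_dim` weights are assumed common to every
    architecture; the remaining weights stay private to that client.
    """
    updated = []
    for c in clients:
        w = local_update(c["weights"], c["grad_fn"],
                         side_grad_fn=c.get("side_grad_fn"))
        updated.append(w)
    # server-side averaging over the shared prefix only
    avg_shared = [sum(w[i] for w in updated) / len(updated)
                  for i in range(shared_dim)]
    # broadcast the averaged shared prefix; keep each private tail as-is
    for c, w in zip(clients, updated):
        c["weights"] = avg_shared + w[shared_dim:]
    return clients
```

Only the shared prefix crosses the network each round, which is one plausible reading of how heterogeneous architectures could be aggregated while limiting communication.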


Related research

- 05/25/2019 · Fair Resource Allocation in Federated Learning: Federated learning involves training statistical models in massive, hete...
- 08/20/2021 · FedSkel: Efficient Federated Learning on Heterogeneous Systems with Skeleton Gradients Update: Federated learning aims to protect users' privacy while performing data ...
- 10/23/2021 · Federated Multiple Label Hashing (FedMLH): Communication Efficient Federated Learning on Extreme Classification Tasks: Federated learning enables many local devices to train a deep learning m...
- 12/01/2021 · Compare Where It Matters: Using Layer-Wise Regularization To Improve Federated Learning on Heterogeneous Data: Federated Learning is a widely adopted method to train neural networks o...
- 06/09/2020 · Distributed Learning on Heterogeneous Resource-Constrained Devices: We consider a distributed system, consisting of a heterogeneous set of d...
- 02/25/2020 · Device Heterogeneity in Federated Learning: A Superquantile Approach: We propose a federated learning framework to handle heterogeneous client...
- 02/15/2020 · Federated Learning with Matched Averaging: Federated learning allows edge devices to collaboratively learn a shared...
