Asynchronous Hierarchical Federated Learning

05/31/2022
by Xing Wang, et al.

Federated learning is a rapidly growing area of research with various benefits and industry applications. Typical federated patterns suffer from intrinsic issues such as heavy server traffic, slow convergence, and unreliable accuracy. In this paper, we address these issues by proposing asynchronous hierarchical federated learning, in which the central server uses either the network topology or a clustering algorithm to assign workers (i.e., client devices) to clusters. In each cluster, a special aggregator device is selected to enable hierarchical learning, which leads to efficient communication between the server and workers and significantly reduces the burden on the server. In addition, an asynchronous federated learning scheme is used to tolerate heterogeneity in the system and achieve fast convergence: the server aggregates gradients from workers weighted by a staleness parameter to update the global model, and workers perform regularized stochastic gradient descent, so that the instability of asynchronous learning is alleviated. We evaluate the proposed algorithm on the CIFAR-10 image classification task; the experimental results demonstrate the effectiveness of asynchronous hierarchical federated learning.
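The staleness-weighted aggregation and regularized local SGD described above can be sketched as follows. This is a minimal NumPy illustration, not the paper's implementation: the polynomial decay for the staleness weight, the convex mixing rule, and all hyperparameter values are assumptions, since the abstract does not give exact formulas.

```python
import numpy as np

def staleness_weight(staleness, alpha=0.6):
    # Down-weight updates computed against an old global model.
    # Polynomial decay is one common choice (assumed here).
    return alpha * (1.0 + staleness) ** -0.5

def server_update(w_global, w_worker, staleness):
    # Mix a (possibly stale) worker model into the global model;
    # the fresher the update, the larger its weight.
    a = staleness_weight(staleness)
    return (1.0 - a) * w_global + a * w_worker

def local_sgd_step(w_local, grad, w_global, lr=0.1, mu=0.01):
    # Regularized SGD at the worker: the proximal term
    # mu * (w_local - w_global) pulls the local model back toward
    # the global model, damping drift from asynchrony.
    return w_local - lr * (grad + mu * (w_local - w_global))

# Toy round: one worker update with staleness 2, then a local step.
w_global = server_update(np.zeros(3), np.ones(3), staleness=2)
w_local = local_sgd_step(np.ones(3), np.zeros(3), w_global)
```

In a hierarchical deployment, the per-cluster aggregator would apply `server_update`-style mixing to its cluster members' models before forwarding a single aggregate to the central server, which cuts server traffic roughly by the cluster size.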


