Hierarchical Quantized Federated Learning: Convergence Analysis and System Design

03/26/2021

by Lumin Liu, et al.

Federated learning is a collaborative machine learning framework for training deep neural networks without accessing clients' private data. Previous works assume a single central parameter server, located either in the cloud or at the edge. A cloud server can aggregate knowledge from all participating clients but suffers high communication overhead and latency, while an edge server enjoys more efficient communication during model updates but can only reach a limited number of clients. This paper exploits the advantages of both cloud and edge servers and considers a Hierarchical Quantized Federated Learning (HQFL) system with one cloud server, several edge servers, and many clients, adopting a communication-efficient training algorithm, Hier-Local-QSGD. The high communication efficiency comes from frequent local aggregations at the edge servers and fewer aggregations at the cloud server, as well as weight quantization during model uploading. A tight convergence bound for non-convex objective loss functions is derived and then applied to two design problems: the accuracy-latency trade-off and edge-client association. It is shown that, given a latency budget for the whole training process, there is an optimal choice of the two aggregation intervals and the two quantization levels. For the edge-client association problem, it is found that the association strategy has no impact on the convergence speed. Empirical simulations verify the findings from the convergence analysis and demonstrate the accuracy-latency trade-off in the hierarchical federated learning system.
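The abstract describes the structure of Hier-Local-QSGD: each client runs several local SGD steps, uploads a quantized update to its edge server every k1 steps, and each edge server uploads a quantized model to the cloud after k2 edge aggregations. The paper's code is not included on this page; the following is a minimal sketch of how such a two-level loop with QSGD-style stochastic quantization could look. All names here (`hier_local_qsgd`, `quantize`, the parameters `k1`, `k2`, `s_edge`, `s_cloud`) are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def quantize(v, s):
    """QSGD-style stochastic uniform quantization with s levels.
    Unbiased: E[quantize(v, s)] = v."""
    norm = np.linalg.norm(v)
    if norm == 0:
        return v.copy()
    scaled = np.abs(v) / norm * s          # in [0, s]
    lower = np.floor(scaled)
    prob = scaled - lower                  # round up with this probability
    levels = lower + (np.random.rand(*v.shape) < prob)
    return norm * np.sign(v) * levels / s

def hier_local_qsgd(w0, client_grads, assoc, k1, k2, rounds,
                    lr=0.1, s_edge=8, s_cloud=4):
    """Hierarchical training sketch (names and structure assumed):
    - each client runs k1 local SGD steps, then uploads a quantized
      model delta to its edge server;
    - after k2 edge aggregations, each edge uploads a quantized delta
      to the cloud.
    assoc maps edge ids to lists of client indices."""
    w_cloud = w0.copy()
    for _ in range(rounds):                          # cloud rounds
        edge_deltas = []
        for edge, clients in assoc.items():
            w_edge = w_cloud.copy()
            for _ in range(k2):                      # edge aggregation interval
                updates = []
                for c in clients:
                    w = w_edge.copy()
                    for _ in range(k1):              # local SGD steps
                        w = w - lr * client_grads[c](w)
                    # client -> edge upload, quantized with s_edge levels
                    updates.append(quantize(w - w_edge, s_edge))
                w_edge = w_edge + np.mean(updates, axis=0)
            # edge -> cloud upload, quantized with s_cloud levels
            edge_deltas.append(quantize(w_edge - w_cloud, s_cloud))
        w_cloud = w_cloud + np.mean(edge_deltas, axis=0)
    return w_cloud
```

On a toy problem where client c minimizes ||w - t_c||^2/2 (gradient w - t_c), the loop drives the cloud model toward the average of the client optima, with residual noise controlled by the quantization levels; this mirrors the trade-off the paper analyzes between communication cost (coarser quantization, longer intervals) and accuracy.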


