Delay-Aware Hierarchical Federated Learning

03/22/2023
by Frank Po-Chen Lin, et al.

Federated learning has gained popularity as a means of training models distributed across the wireless edge. This paper introduces delay-aware hierarchical federated learning (DFL) to improve the efficiency of distributed machine learning (ML) model training by addressing communication delays between the edge and the cloud. During each global aggregation interval, devices in DFL perform multiple stochastic gradient descent iterations on their local datasets, and edge servers in each local subnetwork intermittently aggregate the resulting model parameters. At each global synchronization, the cloud server reconciles the local models with the deployed global model via a local-global combiner. The convergence behavior of DFL is analyzed theoretically under a generalized data heterogeneity metric, yielding a set of conditions under which a sublinear convergence rate of O(1/k) is achieved. Building on these findings, an adaptive control algorithm for DFL is developed that tunes its policies to reduce energy consumption and edge-to-cloud communication latency while preserving the sublinear convergence rate. Numerical evaluations show DFL's superior performance relative to existing FL algorithms in terms of faster global model convergence, reduced resource consumption, and robustness against communication delays. In summary, the proposed method offers improved efficiency and strong results for both convex and non-convex loss functions.
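The training loop the abstract describes can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function names, the least-squares stand-in objective, and the combiner weight `gamma` are all assumptions made for the example; only the overall structure (device-level SGD, intermittent edge aggregation inside each subnetwork, and a cloud-side local-global combining step) follows the abstract.

```python
import numpy as np

def local_sgd(w, data, lr=0.1, steps=5):
    # Several SGD steps on one device's local dataset
    # (least-squares loss as a stand-in objective).
    X, y = data
    for _ in range(steps):
        grad = X.T @ (X @ w - y) / len(y)
        w = w - lr * grad
    return w

def dfl_round(w_global, subnetworks, gamma=0.5, local_aggs=2):
    """One global aggregation interval: each edge subnetwork alternates
    device-level SGD with intermittent edge aggregation; the cloud then
    mixes the (possibly delayed) edge models with the model it deployed
    at the start of the interval, via combiner weight `gamma`
    (hypothetical parameter name)."""
    edge_models = []
    for devices in subnetworks:
        w_edge = w_global.copy()
        for _ in range(local_aggs):
            # Devices train in parallel starting from the edge model...
            device_models = [local_sgd(w_edge.copy(), d) for d in devices]
            # ...then the edge server averages them.
            w_edge = np.mean(device_models, axis=0)
        edge_models.append(w_edge)
    # Cloud synchronization: local-global combining step.
    w_edge_avg = np.mean(edge_models, axis=0)
    return gamma * w_edge_avg + (1 - gamma) * w_global
```

On synthetic linear data split across two subnetworks of two devices each, repeated calls to `dfl_round` drive the global model toward the shared optimum even though each cloud update only partially incorporates the edge consensus, which is the mechanism that makes the scheme tolerant of stale edge-to-cloud updates.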


Related research:

- 12/27/2021 — Resource-Efficient and Delay-Aware Federated Learning Design under Edge Heterogeneity
- 08/21/2020 — Federated Learning with Communication Delay in Edge Networks
- 10/07/2022 — Depersonalized Federated Learning: Tackling Statistical Heterogeneity by Alternating Stochastic Gradient Descent
- 08/02/2023 — Straggler Mitigation and Latency Optimization in Blockchain-based Hierarchical Federated Learning
- 10/07/2022 — Time Minimization in Hierarchical Federated Learning
- 03/18/2022 — Latency Optimization for Blockchain-Empowered Federated Learning in Multi-Server Edge Computing
- 08/17/2023 — Over-the-Air Computation Aided Federated Learning with the Aggregation of Normalized Gradient
