Two Timescale Hybrid Federated Learning with Cooperative D2D Local Model Aggregations

03/18/2021
by Frank Po-Chen Lin, et al.

Federated learning has emerged as a popular technique for distributing machine learning (ML) model training across the wireless edge. In this paper, we propose two-timescale hybrid federated learning (TT-HF), a hybrid between the device-to-server communication paradigm of federated learning and device-to-device (D2D) communications for model training. In TT-HF, during each global aggregation interval, devices (i) perform multiple stochastic gradient descent iterations on their individual datasets, and (ii) aperiodically engage in consensus formation over their model parameters through cooperative, distributed D2D communications within local clusters. Using a new general definition of gradient diversity, we formally study the convergence behavior of TT-HF, resulting in new convergence bounds for distributed ML. We leverage these bounds to develop an adaptive control algorithm that tunes the step size, D2D communication rounds, and global aggregation period of TT-HF over time to target a sublinear convergence rate of O(1/t) while minimizing network resource utilization. Our subsequent experiments demonstrate that TT-HF significantly outperforms the current state of the art in federated learning in terms of model accuracy and/or network energy consumption across scenarios where local device datasets exhibit statistical heterogeneity.
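To make the two-timescale procedure concrete, below is a minimal Python/NumPy sketch of one way the TT-HF loop could be organized. It is an illustration under our own assumptions, not the authors' implementation: the function name tt_hf_sketch, the fixed consensus schedule d2d_every (the paper's D2D rounds are aperiodic and adaptively tuned, as is the step size), and the toy mixing matrix are all hypothetical.

import numpy as np

def tt_hf_sketch(grad_fns, W, T=50, tau=10, d2d_every=5, step=0.05, dim=5):
    # grad_fns: one stochastic-gradient callable per device
    # W: doubly stochastic D2D mixing matrix encoding the cluster topology
    n = len(grad_fns)
    models = np.zeros((n, dim))                 # all devices start from a common global model
    for t in range(T):                          # slow timescale: global aggregation intervals
        for k in range(1, tau + 1):             # fast timescale: local SGD iterations
            for i in range(n):                  # (i) each device descends on its own dataset
                models[i] -= step * grad_fns[i](models[i])
            if k % d2d_every == 0:              # (ii) D2D consensus round (fixed schedule here;
                models = W @ models             #      aperiodic and adaptive in the paper)
        global_model = models.mean(axis=0)      # device-to-server aggregation
        models = np.tile(global_model, (n, 1))  # broadcast the new global model to all devices
    return global_model

# Toy usage: heterogeneous quadratic objectives f_i(x) = 0.5 * ||x - c_i||^2
rng = np.random.default_rng(0)
n, dim = 4, 5
centers = rng.normal(size=(n, dim))
grad_fns = [lambda x, c=c: (x - c) + 0.01 * rng.normal(size=dim) for c in centers]
W = np.full((n, n), 1.0 / n)                    # single fully connected cluster
print(tt_hf_sketch(grad_fns, W, dim=dim))

Each D2D round multiplies the stacked device models by the doubly stochastic mixing matrix W, so repeated rounds drive the models within a cluster toward their cluster average; this is the role consensus formation plays between global aggregations.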


Related research

09/07/2021 · Federated Learning Beyond the Star: Local D2D Model Consensus with Global Cluster Sampling
Federated learning has emerged as a popular technique for distributing m...

07/18/2020 · Multi-Stage Hybrid Federated Learning over Large-Scale Wireless Fog Networks
One of the popular methods for distributed machine learning (ML) is fede...

03/15/2023 · Connectivity-Aware Semi-Decentralized Federated Learning over Time-Varying D2D Networks
Semi-decentralized federated learning blends the conventional device-to-...

02/07/2022 · Parallel Successive Learning for Dynamic Distributed Model Training over Heterogeneous Wireless Networks
Federated learning (FedL) has emerged as a popular technique for distrib...

01/04/2021 · Device Sampling for Heterogeneous Federated Learning: Theory, Algorithms, and Implementation
The conventional federated learning (FedL) architecture distributes mach...

11/26/2021 · Dynamic Network-Assisted D2D-Aided Coded Distributed Learning
Today, various machine learning (ML) applications offer continuous data ...

03/15/2023 · Towards Cooperative Federated Learning over Heterogeneous Edge/Fog Networks
Federated learning (FL) has been promoted as a popular technique for tra...
