Design and Analysis of Uplink and Downlink Communications for Federated Learning

12/07/2020
by Sihui Zheng, et al.

Communication is known to be one of the primary bottlenecks of federated learning (FL), yet existing studies have not addressed efficient communication design, particularly in wireless FL where both uplink and downlink communications have to be considered. In this paper, we focus on the design and analysis of physical-layer quantization and transmission methods for wireless FL. We answer the question of what and how to communicate between clients and the parameter server, and evaluate the impact of the various quantization and transmission options of the updated model on the learning performance. We provide a new convergence analysis of the well-known FedAvg algorithm under non-i.i.d. dataset distributions, partial client participation, and finite-precision quantization in both uplink and downlink communications. The analysis reveals that, in order to achieve an O(1/T) convergence rate with quantization, transmitting the weights requires increasing the quantization level at a logarithmic rate, while transmitting the weight differential can keep a constant quantization level. Comprehensive numerical evaluation on various real-world datasets reveals that the benefit of an FL-tailored uplink and downlink communication design is enormous: a carefully designed quantization and transmission scheme achieves more than 98% of the baseline accuracy with fewer than 10% of the baseline bandwidth in the majority of the experiments, on both i.i.d. and non-i.i.d. datasets. In particular, 1-bit quantization (3.1% of the baseline bandwidth) achieves accuracy close to the floating-point baseline at almost the same convergence rate on MNIST, representing the best known bandwidth-accuracy tradeoff to the best of the authors' knowledge.
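The abstract contrasts transmitting quantized weights with transmitting quantized weight differentials. Below is a minimal NumPy sketch, not the paper's exact scheme: it implements the unbiased stochastic uniform quantizer commonly assumed in quantized-FL analyses and compares the two uplink options. All names here (stochastic_quantize, w_global, w_new, num_levels) are illustrative assumptions, not identifiers from the paper.

```python
import numpy as np

def stochastic_quantize(x, num_levels):
    """Stochastically quantize x onto num_levels uniform levels spanning [min(x), max(x)].

    Rounding up/down is randomized so the quantizer is unbiased, E[Q(x)] = x,
    a standard assumption in quantized-FL convergence analyses.
    """
    lo, hi = x.min(), x.max()
    if hi == lo:                        # constant vector: nothing to quantize
        return x.copy()
    step = (hi - lo) / (num_levels - 1)
    scaled = (x - lo) / step            # position of each entry on the level grid
    floor = np.floor(scaled)
    prob_up = scaled - floor            # round up with probability equal to the remainder
    rounded = floor + (np.random.rand(*x.shape) < prob_up)
    return lo + rounded * step

# Toy setup: a client holds new weights w_new after local SGD,
# starting from the last global model w_global.
rng = np.random.default_rng(0)
w_global = rng.standard_normal(1000)
w_new = w_global + 0.01 * rng.standard_normal(1000)

# Option A: quantize and send the weights themselves.
sent_a = stochastic_quantize(w_new, num_levels=2**4)

# Option B: quantize and send the weight differential; the server adds it back.
sent_b = stochastic_quantize(w_new - w_global, num_levels=2**4)
recovered_b = w_global + sent_b

# The differential has a far smaller dynamic range, so the same number of
# levels gives a far smaller quantization error.
print(np.abs(sent_a - w_new).mean(), np.abs(recovered_b - w_new).mean())
```

One intuition consistent with the stated convergence result: the weight differential shrinks as training converges, so its dynamic range, and hence the error of a fixed-level quantizer, shrinks with it, whereas the weights themselves keep a roughly constant range and need progressively finer quantization to sustain the O(1/T) rate.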

Related research

03/11/2022
Wireless Quantized Federated Learning: A Joint Computation and Communication Design
Recently, federated learning (FL) has sparked widespread attention as a ...

01/06/2021
Federated Learning over Noisy Channels: Convergence Analysis and Design Examples
Does Federated Learning (FL) work when both uplink and downlink communic...

07/14/2023
FedBIAD: Communication-Efficient and Accuracy-Guaranteed Federated Learning with Bayesian Inference-Based Adaptive Dropout
Federated Learning (FL) emerges as a distributed machine learning paradi...

07/14/2023
Improved Convergence Analysis and SNR Control Strategies for Federated Learning in the Presence of Noise
We propose an improved convergence analysis technique that characterizes...

07/20/2023
Communication-Efficient Federated Learning over Capacity-Limited Wireless Networks
In this paper, a communication-efficient federated learning (FL) framewo...

08/25/2020
Convergence of Federated Learning over a Noisy Downlink
We study federated learning (FL), where power-limited wireless devices u...

11/15/2021
On the Tradeoff between Energy, Precision, and Accuracy in Federated Quantized Neural Networks
Deploying federated learning (FL) over wireless networks with resource-c...
