Improved Convergence Rates for Non-Convex Federated Learning with Compression

12/07/2020
by Rudrajit Das, et al.

Federated learning is a new distributed learning paradigm that enables efficient training of emerging large-scale machine learning models. In this paper, we consider federated learning on non-convex objectives with compressed communication from the clients to the central server. We propose a novel first-order algorithm (FedSTEPH2) that employs compressed communication and achieves the optimal iteration complexity of 𝒪(1/ϵ^1.5) to reach an ϵ-stationary point (i.e., a point x with 𝔼[‖∇f(x)‖²] ≤ ϵ) on smooth non-convex objectives. The proposed scheme is the first algorithm to attain this optimal complexity with compressed communication and without using full client gradients at each communication round. The key idea that enables FedSTEPH2 to attain this optimal complexity is the application of judicious momentum terms in both the local client updates and the global server update. As a precursor to FedSTEPH2, we propose FedSTEPH, which involves a momentum term only in the local client updates. We establish that FedSTEPH enjoys improved convergence rates under various non-convex settings (such as the Polyak-Łojasiewicz condition) and with fewer assumptions than prior work.
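The abstract describes the mechanism only at a high level: momentum in the local client updates, momentum again in the global server update, and compression of the client-to-server messages. The sketch below is a minimal illustration of that general pattern, not the paper's actual FedSTEPH2 update rules; the top-k compressor, the toy quadratic objectives, and all hyperparameters are assumptions made for the example.

```python
import numpy as np

# Minimal sketch of the pattern the abstract describes: local client momentum,
# compressed client-to-server communication, and global server momentum.
# NOTE: this is NOT the paper's FedSTEPH2 algorithm; the compressor, objective,
# and hyperparameters below are illustrative assumptions.

def top_k(v, k):
    """Keep only the k largest-magnitude coordinates (a common biased compressor)."""
    out = np.zeros_like(v)
    idx = np.argsort(np.abs(v))[-k:]
    out[idx] = v[idx]
    return out

def client_update(x, grad_fn, steps=5, lr=0.05, beta_local=0.9):
    """Run local gradient steps with momentum and return the model delta."""
    m = np.zeros_like(x)
    x_local = x.copy()
    for _ in range(steps):
        m = beta_local * m + grad_fn(x_local)  # local momentum term
        x_local -= lr * m
    return x_local - x  # delta to be compressed and sent to the server

def server_round(x, server_m, client_grad_fns, k=2,
                 beta_global=0.5, server_lr=0.5):
    """Aggregate compressed client deltas with a global momentum term."""
    deltas = [top_k(client_update(x, g), k) for g in client_grad_fns]
    avg_delta = np.mean(deltas, axis=0)
    server_m = beta_global * server_m + avg_delta  # global momentum term
    return x + server_lr * server_m, server_m

# Toy heterogeneous objectives f_i(x) = 0.5 * ||x - c_i||^2, so grad f_i(x) = x - c_i.
centers = [np.array([1.0, 0.0, 2.0, -1.0]),
           np.array([-1.0, 2.0, 0.0, 1.0])]
grad_fns = [lambda x, c=c: x - c for c in centers]

x, m = np.zeros(4), np.zeros(4)
for _ in range(50):
    x, m = server_round(x, m, grad_fns)
print(x)  # roughly approaches the average minimizer (the mean of the centers)
```

This sketch only shows where the two momentum terms and the compressor sit in the pipeline; the variance-reduction details that drive the 𝒪(1/ϵ^1.5) rate are in the paper itself.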
