Double Momentum SGD for Federated Learning

02/08/2021
by An Xu, et al.

Communication efficiency is crucial in federated learning. Performing many local training steps on clients to reduce the communication frequency between clients and the server is a common way to address this issue. However, the client drift problem arises because the non-i.i.d. data distributions across clients can severely degrade the performance of federated learning. In this work, we propose a new SGD variant named DOMO to improve model performance in federated learning, in which double momentum buffers are maintained. One momentum buffer tracks the server update direction, while the other tracks the local update direction. We introduce a novel server momentum fusion technique to coordinate the server and local momentum SGD, and we provide the first theoretical analysis involving both server and local momentum SGD. Extensive experimental results show that DOMO achieves better model performance than FedAvg and existing momentum SGD variants on federated learning tasks.
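
The abstract only sketches DOMO at a high level. The toy Python snippet below illustrates the general double-momentum structure it describes: a server-side momentum buffer tracking the server update direction, a per-round local momentum buffer tracking the local update direction, and a fusion term that mixes the server momentum into local steps. The hyperparameter names, the fusion rule, and the toy quadratic objectives are illustrative assumptions, not the paper's exact algorithm.

```python
# Minimal sketch of double-momentum federated SGD on a toy problem.
# The update rules below are illustrative assumptions based on the abstract,
# not the official DOMO algorithm.
import numpy as np

rng = np.random.default_rng(0)
dim, num_clients, local_steps, rounds = 10, 4, 5, 50
lr, local_beta, server_beta, fusion = 0.05, 0.9, 0.9, 0.1

# Toy non-i.i.d. objectives: client k minimizes ||x - c_k||^2 with its own center.
centers = [rng.normal(size=dim) + 3.0 * k for k in range(num_clients)]
grad = lambda x, c: 2.0 * (x - c)

x_server = np.zeros(dim)
server_momentum = np.zeros(dim)          # tracks the server update direction

for r in range(rounds):
    client_deltas = []
    for c in centers:
        x = x_server.copy()
        local_momentum = np.zeros(dim)   # tracks the local update direction
        for _ in range(local_steps):
            g = grad(x, c)
            local_momentum = local_beta * local_momentum + g
            # Server momentum fusion (assumed form): blend a fraction of the
            # server momentum into each local step to counter client drift.
            x -= lr * (local_momentum + fusion * server_momentum)
        client_deltas.append(x_server - x)
    # Average client updates into a pseudo-gradient and apply server momentum SGD.
    avg_delta = np.mean(client_deltas, axis=0)
    server_momentum = server_beta * server_momentum + avg_delta
    x_server -= server_momentum

print("distance to consensus optimum:", np.linalg.norm(x_server - np.mean(centers, axis=0)))
```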

Related research

Mime: Mimicking Centralized Stochastic Algorithms in Federated Learning (08/08/2020)
Federated learning is a challenging optimization problem due to the hete...

FedCM: Federated Learning with Client-level Momentum (06/21/2021)
Federated Learning is a distributed machine learning approach which enab...

STEM: A Stochastic Two-Sided Momentum Algorithm Achieving Near-Optimal Sample and Communication Complexities for Federated Learning (06/19/2021)
Federated Learning (FL) refers to the paradigm where multiple worker nod...

Robust Federated Learning with Noisy Labels (12/03/2020)
Federated learning is a paradigm that enables local devices to jointly t...

Momentum Benefits Non-IID Federated Learning Simply and Provably (06/28/2023)
Federated learning is a powerful paradigm for large-scale machine learni...

FetchSGD: Communication-Efficient Federated Learning with Sketching (07/15/2020)
Existing approaches to federated learning suffer from a communication bo...

Improving the Robustness of Federated Learning for Severely Imbalanced Datasets (04/28/2022)
With the ever increasing data deluge and the success of deep neural netw...
