FedDM: Iterative Distribution Matching for Communication-Efficient Federated Learning

by Yuanhao Xiong et al.
The Hong Kong University of Science and Technology

Federated learning (FL) has recently attracted increasing attention from academia and industry, with the ultimate goal of achieving collaborative training under privacy and communication constraints. Existing iterative model-averaging-based FL algorithms require a large number of communication rounds to obtain a well-performing model, due to extremely unbalanced and non-i.i.d. data partitioning among different clients. Thus, we propose FedDM to build the global training objective from multiple local surrogate functions, which gives the server a more global view of the loss landscape. In detail, we construct a synthetic set of data on each client that locally matches the loss landscape of the original data through distribution matching. Compared with transmitting unwieldy model weights, FedDM reduces communication rounds and improves model quality by transmitting smaller, more informative synthesized data. We conduct extensive experiments on three image classification datasets, and the results show that our method outperforms other FL counterparts in terms of both efficiency and model performance. Moreover, we demonstrate that FedDM can be adapted to preserve differential privacy with the Gaussian mechanism, and it trains a better model under the same privacy budget.
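To make the distribution-matching idea concrete, the sketch below optimizes a small synthetic set so that its mean embedding matches that of a client's real data. This is a minimal illustration, not the paper's exact objective: FedDM matches distributions in the embedding space of the model being trained, whereas here a single fixed random ReLU feature map stands in for that embedding, and all shapes and hyperparameters are made up for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes for illustration: 256 real points, 16 synthetic
# points, 8 input features, 32 embedding dimensions.
N, M, d, h = 256, 16, 8, 32
X = rng.normal(loc=1.0, size=(N, d))      # a client's real data
S = rng.normal(size=(M, d))               # learnable synthetic set
W = rng.normal(size=(h, d)) / np.sqrt(d)  # fixed random feature extractor

def features(Z):
    """Random ReLU embedding phi(z) = relu(W z)."""
    return np.maximum(Z @ W.T, 0.0)

def dm_loss(S):
    """Squared distance between mean embeddings of real and synthetic data."""
    diff = features(X).mean(axis=0) - features(S).mean(axis=0)
    return float(diff @ diff)

# Plain gradient descent on the synthetic points themselves.
lr = 0.2
losses = [dm_loss(S)]
for _ in range(300):
    H = S @ W.T  # pre-activations of synthetic points, shape (M, h)
    diff = features(X).mean(axis=0) - np.maximum(H, 0.0).mean(axis=0)
    # d/dS of ||mean_phi(X) - mean_phi(S)||^2; (H > 0) is the ReLU mask
    grad = -(2.0 / M) * ((H > 0) * diff) @ W
    S -= lr * grad
    losses.append(dm_loss(S))
```

In the FL setting sketched by the abstract, each client would transmit its optimized `S` (16 points here) to the server instead of full model weights, and the server would train on the union of the clients' synthetic sets.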


