Scaff-PD: Communication Efficient Fair and Robust Federated Learning

07/25/2023
by Yaodong Yu et al.

We present Scaff-PD, a fast and communication-efficient algorithm for distributionally robust federated learning. Our approach improves fairness by optimizing a family of distributionally robust objectives tailored to heterogeneous clients. We leverage the special structure of these objectives and design an accelerated primal-dual (APD) algorithm that uses bias-corrected local steps (as in Scaffold) to achieve significant gains in communication efficiency and convergence speed. We evaluate Scaff-PD on several benchmark datasets and demonstrate its effectiveness in improving fairness and robustness while maintaining competitive accuracy. Our results suggest that Scaff-PD is a promising approach for federated learning in resource-constrained and heterogeneous settings.
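Concretely, distributionally robust objectives of this kind typically take a min-max form, min_x max_{λ∈Λ} Σ_i λ_i f_i(x) − ψ(λ), where f_i is client i's loss, λ weights the clients over (a subset of) the probability simplex, and ψ is a strongly concave regularizer. The sketch below is only meant to illustrate that structure, not to reproduce the paper's algorithm: the client API (.loss/.grad), the quadratic regularizer ψ(λ) = (ρ/2)‖λ − 1/n‖², the fixed step sizes, full client participation, and the use of plain rather than accelerated primal-dual steps are all simplifying assumptions.

```python
import numpy as np

def project_simplex(v):
    """Euclidean projection onto the probability simplex (Duchi et al., 2008)."""
    u = np.sort(v)[::-1]
    css = np.cumsum(u)
    k = np.nonzero(u * np.arange(1, len(v) + 1) > css - 1.0)[0][-1]
    tau = (css[k] - 1.0) / (k + 1)
    return np.maximum(v - tau, 0.0)

def scaff_pd_round_sketch(clients, x, lam, c, c_global,
                          local_steps=10, eta_x=0.1, eta_lam=0.1, rho=1.0):
    """One hypothetical Scaff-PD-style round. Each element of `clients` is
    assumed to expose .loss(x) and .grad(x); this API is illustrative only."""
    n = len(clients)

    # Dual ascent: upweight high-loss clients, regularize toward the uniform
    # weights 1/n, and project back onto the simplex.
    losses = np.array([cl.loss(x) for cl in clients])
    lam = project_simplex(lam + eta_lam * (losses - rho * (lam - 1.0 / n)))

    # Primal step: each client runs bias-corrected local updates (as in
    # Scaffold), using (c_global - c_i) to cancel client drift.
    deltas, c_new = [], []
    for i, cl in enumerate(clients):
        y = x.copy()
        for _ in range(local_steps):
            y -= eta_x * (cl.grad(y) - c[i] + c_global)
        c_new.append(c[i] - c_global + (x - y) / (local_steps * eta_x))
        deltas.append(y - x)

    # Server: aggregate client updates weighted by the dual variable lam,
    # so worse-off clients pull harder on the global model.
    x = x + sum(lam[i] * deltas[i] for i in range(n))
    c = np.array(c_new)
    c_global = c.mean(axis=0)  # assumes full participation
    return x, lam, c, c_global
```

On this reading, the λ-weighted aggregation is the fairness and robustness mechanism, while the control variates c_i remove the bias that many local steps would otherwise introduce under heterogeneous data, which is where Scaffold-style methods obtain their communication savings.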


