Byzantine-Robust Federated Learning with Optimal Statistical Rates and Privacy Guarantees

05/24/2022
by Banghua Zhu, et al.

We propose Byzantine-robust federated learning protocols with nearly optimal statistical rates. In contrast to prior work, our proposed protocols improve the dimension dependence and achieve a tight statistical rate in terms of all the parameters for strongly convex losses. We benchmark against competing protocols and show the empirical superiority of the proposed protocols. Finally, we remark that our protocols with bucketing can be naturally combined with privacy-guaranteeing procedures to provide security against a semi-honest server. The code for evaluation is available at https://github.com/wanglun1996/secure-robust-federated-learning.
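The abstract references two concrete mechanisms, bucketing and robust aggregation, without spelling them out. Below is a minimal NumPy sketch of the general bucketing idea: randomly partition client updates into small buckets, average within each bucket, then apply a robust aggregator (coordinate-wise median here) to the bucket means. The function name, the bucket_size parameter, and the choice of median are illustrative assumptions, not the paper's exact protocol.

```python
import numpy as np

def bucketing_robust_aggregate(updates, bucket_size=2, rng=None):
    """Illustrative bucketing + robust aggregation (not the paper's exact protocol).

    updates: list of 1-D numpy arrays, one model update per client.
    bucket_size: number of client updates averaged per bucket.
    """
    rng = np.random.default_rng() if rng is None else rng
    stacked = np.stack(updates)                  # shape: (n_clients, dim)
    perm = rng.permutation(len(stacked))         # random bucket assignment
    bucket_means = [stacked[perm[i:i + bucket_size]].mean(axis=0)
                    for i in range(0, len(perm), bucket_size)]
    # Robust aggregation of the bucket means; coordinate-wise median as an example.
    return np.median(np.stack(bucket_means), axis=0)

# Toy usage: 8 honest clients plus 2 Byzantine clients sending large updates.
rng = np.random.default_rng(0)
honest = [rng.normal(0.0, 1.0, size=10) for _ in range(8)]
byzantine = [np.full(10, 100.0) for _ in range(2)]
aggregate = bucketing_robust_aggregate(honest + byzantine, bucket_size=2, rng=rng)
```

Averaging within random buckets dilutes the influence of any single Byzantine client before the robust aggregator runs, which is one reason bucketing-based schemes combine naturally with downstream procedures such as secure aggregation.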


Related Research

06/07/2022 · Federated Hetero-Task Learning
To investigate the heterogeneity of federated learning in real-world sce...

07/21/2020 · Byzantine-Resilient Secure Federated Learning
Secure federated learning is a privacy-preserving framework to improve m...

06/16/2019 · Robust Federated Learning in a Heterogeneous Environment
We study a recently proposed large-scale distributed learning paradigm, ...

02/20/2023 · OLYMPIA: A Simulation Framework for Evaluating the Concrete Scalability of Secure Aggregation Protocols
Recent secure aggregation protocols enable privacy-preserving federated ...

03/18/2023 · Byzantine-Resilient Federated Learning at Edge
Both Byzantine resilience and communication efficiency have attracted tr...

08/07/2021 · The Effect of Training Parameters and Mechanisms on Decentralized Federated Learning based on MNIST Dataset
Federated Learning is an algorithm suited for training models on decentr...

Code Repositories

secure-robust-federated-learning

