Federated Learning in the Presence of Adversarial Client Unavailability

by Lili Su, et al.

Federated learning is a decentralized machine learning framework in which not all clients are able to participate in each round. An emerging line of research is devoted to tackling arbitrary client unavailability. Existing theoretical analyses impose restrictive structural assumptions on the unavailability patterns, and the proposed algorithms are tailored to those assumptions. In this paper, we relax those assumptions and consider adversarial client unavailability. To quantify the degree of client unavailability, we use the notion of ϵ-adversary dropout fraction. For both non-convex and strongly-convex global objectives, we show that simple variants of FedAvg and FedProx, albeit completely agnostic to ϵ, converge to an estimation error on the order of ϵ(G^2 + σ^2), where G is a heterogeneity parameter and σ^2 is the noise level. We prove that this estimation error is minimax-optimal. We also show that the variants of FedAvg and FedProx have convergence speeds O(1/√T) for non-convex objectives and O(1/T) for strongly-convex objectives, both of which are the best possible for any first-order method that only has access to noisy gradients. Our proofs build upon a tight analysis of the selection bias that persists throughout the entire learning process. We validate our theoretical predictions through numerical experiments on synthetic and real-world datasets.
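To make the setting concrete, the following is a minimal sketch of FedAvg under an ϵ-adversary dropout fraction: in each round the adversary may withhold up to an ϵ fraction of clients, and the server averages only the updates it receives. This is an illustrative toy (least-squares clients, a simple worst-case dropout heuristic), not the paper's actual algorithm variants; the function name and data layout are assumptions for the example.

```python
import numpy as np

def fedavg_with_adversarial_dropout(client_data, rounds=100, lr=0.1, eps=0.2):
    """Toy FedAvg on per-client least-squares losses.

    Each round, an adversary removes up to an eps fraction of clients;
    here it drops the k clients whose updates move the model the most,
    a simple proxy for worst-case unavailability (illustrative only).
    client_data: list of (X, y) pairs, one per client.
    """
    d = client_data[0][0].shape[1]
    w = np.zeros(d)
    n = len(client_data)
    k = int(np.floor(eps * n))  # adversary's per-round dropout budget
    for _ in range(rounds):
        # One local gradient step per client on its quadratic loss.
        updates = []
        for X, y in client_data:
            grad = X.T @ (X @ w - y) / len(y)
            updates.append(w - lr * grad)
        # Adversary withholds the k largest-moving updates.
        order = np.argsort([-np.linalg.norm(u - w) for u in updates])
        available = [updates[i] for i in order[k:]]
        # Server is agnostic to eps: it simply averages what arrives.
        w = np.mean(available, axis=0)
    return w
```

Consistent with the abstract's bound, the returned model's error does not vanish even as rounds grow: the persistent selection bias leaves a residual error scaling with ϵ and the heterogeneity of the dropped clients.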




