Toward Understanding the Influence of Individual Clients in Federated Learning

by Yihao Xue, et al.

Federated learning allows mobile clients to jointly train a global model without sending their private data to a central server. Although extensive work has studied performance guarantees for the global model, it remains unclear how each individual client influences the collaborative training process. In this work, we define a novel notion, called Fed-Influence, which quantifies this influence in terms of model parameters, and propose an effective and efficient estimation algorithm. In particular, our design satisfies several desirable properties: (1) it requires neither retraining nor retracing, adding only linear computational overhead for clients and the server; (2) it strictly maintains the tenet of federated learning, without revealing any client's local data; and (3) it works well on both convex and non-convex loss functions and does not require the final model to be optimal. Empirical results on a synthetic dataset and the FEMNIST dataset show that our estimation method approximates Fed-Influence with small bias. We further demonstrate an application to client-level model debugging.
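To make the influence notion concrete, the sketch below implements the naive leave-one-out baseline: retrain FedAvg once without each client and measure the resulting shift in model parameters. This retraining is precisely the cost the paper's estimator is designed to avoid; the FedAvg setup, synthetic data, and all names here are illustrative assumptions, not the paper's algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])

# Synthetic local datasets for 4 clients (simple linear regression).
clients = []
for _ in range(4):
    X = rng.normal(size=(50, 2))
    y = X @ true_w + 0.1 * rng.normal(size=50)
    clients.append((X, y))

def local_step(w, X, y, lr=0.1, epochs=5):
    """A few steps of local gradient descent on squared loss."""
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w = w - lr * grad
    return w

def fedavg(client_data, rounds=20):
    """Plain FedAvg: broadcast the global model, average local updates."""
    w = np.zeros(2)
    for _ in range(rounds):
        updates = [local_step(w.copy(), X, y) for X, y in client_data]
        w = np.mean(updates, axis=0)
    return w

w_full = fedavg(clients)

# Naive leave-one-out influence of client k: the parameter shift when
# client k is excluded from training (one full retraining per client).
for k in range(len(clients)):
    w_wo = fedavg(clients[:k] + clients[k + 1:])
    print(f"client {k}: ||w_full - w_without_k|| = "
          f"{np.linalg.norm(w_full - w_wo):.4f}")
```

With homogeneous clients the shifts are small; a client holding skewed or mislabeled data would produce a noticeably larger shift, which is the intuition behind using such a notion for client-level model debugging.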



