To Federate or Not To Federate: Incentivizing Client Participation in Federated Learning

by Yae Jee Cho et al.

Federated learning (FL) facilitates collaboration between a group of clients who seek to train a common machine learning model without directly sharing their local data. Although there is an abundance of research on improving the speed, efficiency, and accuracy of federated training, most works implicitly assume that all clients are willing to participate in the FL framework. Due to data heterogeneity, however, the global model may not work well for some clients, and they may instead choose to use their own local models. Such disincentivization of clients can be problematic from the server's perspective because having more participating clients yields a better global model and offers better privacy guarantees to the participating clients. In this paper, we propose an algorithm called IncFL that explicitly maximizes the fraction of clients who are incentivized to use the global model by dynamically adjusting the aggregation weights assigned to their updates. Our experiments show that IncFL increases the number of incentivized clients by 30-55% compared to standard federated training algorithms, and can also improve the generalization performance of the global model on unseen clients.
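The abstract's key idea, reweighting client updates so that more clients prefer the global model over their local one, can be illustrated with a minimal sketch. Note that this is a hypothetical illustration of the general principle, not the paper's actual IncFL algorithm: the weighting rule below (extra aggregation mass for clients whose global-model loss still exceeds their local-model baseline) and the function name `incentivized_aggregate` are assumptions for exposition.

```python
import numpy as np

def incentivized_aggregate(client_updates, global_losses, local_losses):
    """Hypothetical sketch of incentive-aware aggregation: clients for
    whom the current global model is worse than their own local model
    (global loss > local loss) get larger aggregation weights, steering
    the next global model toward incentivizing them to participate.

    client_updates: list of np.ndarray model updates, one per client
    global_losses:  per-client loss of the current global model
    local_losses:   per-client loss of each client's own local model
    """
    # Positive gap means the client is currently disincentivized.
    gaps = np.maximum(np.asarray(global_losses) - np.asarray(local_losses), 0.0)
    # Uniform base weight plus extra mass proportional to the gap.
    weights = 1.0 + gaps
    weights /= weights.sum()
    return sum(w * u for w, u in zip(weights, client_updates))

# With equal losses the rule reduces to plain uniform averaging (FedAvg-style);
# a disincentivized client pulls the aggregate toward its own update.
updates = [np.array([1.0]), np.array([2.0]), np.array([3.0])]
uniform = incentivized_aggregate(updates, [0.5, 0.5, 0.5], [0.5, 0.5, 0.5])
skewed = incentivized_aggregate(updates, [1.5, 0.5, 0.5], [0.5, 0.5, 0.5])
```

In the second call, the first client's gap of 1.0 gives it weight 0.5 versus 0.25 for the others, so the aggregate shifts toward that client's update.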




