Client Selection in Federated Learning based on Gradients Importance
Federated learning (FL) enables multiple devices to collaboratively learn a global model without sharing their personal data. In real-world applications, the participating parties are likely to have heterogeneous data distributions and limited communication bandwidth. In this paper, we are interested in improving the communication efficiency of FL systems. We investigate and design a device selection strategy based on the importance of gradient norms: at each communication round, our approach selects the devices with the highest norms of gradient values. We study the convergence and performance of this selection technique and compare it to existing ones. We perform several experiments in a non-IID setup. The results show that our method converges with a considerable increase in test accuracy compared to random selection.
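The selection rule described above (keep the clients whose local gradients have the largest norms) can be sketched as follows. This is a minimal illustration, not the paper's implementation; the function name, the use of the L2 norm, and the dictionary interface are assumptions for the example.

```python
import numpy as np

def select_clients(gradient_norms, k):
    """Pick the k clients with the largest gradient norms.

    gradient_norms: dict mapping client id -> norm of that client's
        last local gradient (assumed L2 norm for this sketch).
    k: number of clients to select for the communication round.
    """
    ranked = sorted(gradient_norms, key=gradient_norms.get, reverse=True)
    return ranked[:k]

# Hypothetical norms for four clients; 'c' and 'a' have the largest
# gradients, so a round with k=2 would select them.
norms = {"a": 2.5, "b": 0.3, "c": 4.1, "d": 1.0}
print(select_clients(norms, 2))  # ['c', 'a']
```

In practice the server would compute each norm from the client's reported update, e.g. `np.linalg.norm(update)`, before ranking; random selection corresponds to replacing the sort with a uniform sample.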