Equitable-FL: Federated Learning with Sparsity for Resource-Constrained Environment

09/02/2023
by Indrajeet Kumar Sinha, et al.

In Federated Learning, model training is performed across multiple computing devices, and only model parameters are shared with a common central server; data instances never leave the clients. This strategy assumes an abundance of resources on individual clients and exploits those resources to build richer user models. However, when that assumption is violated, learning may not be possible, as resource-scarce nodes cannot participate in the process. In this paper, we propose a sparse form of federated learning that performs well in a resource-constrained environment. Our goal is to make learning possible regardless of a node's scarcity of storage, compute, or bandwidth. The method rests on the observation that resource scarcity is defined by model size vis-à-vis the available resources, so reducing the number of parameters without sacrificing accuracy is key to model training in a resource-constrained environment. In this work, the Lottery Ticket Hypothesis approach is used to progressively sparsify models so that resource-scarce nodes can participate in collaborative training. We validate Equitable-FL on the MNIST, F-MNIST, and CIFAR-10 benchmark datasets, as well as on the Brain-MRI and PlantVillage datasets. We further examine the effect of sparsity on performance, model size compaction, and training speed-up. Results from experiments training convolutional neural networks confirm the efficacy of Equitable-FL in heterogeneous resource-constrained learning environments.
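To make the progressive-sparsification idea concrete, below is a minimal sketch of Lottery-Ticket-style global magnitude pruning in PyTorch, of the kind that could be applied between federated rounds. The function names, the per-round sparsity schedule, and the toy model are illustrative assumptions, not the paper's actual implementation.

```python
import torch
import torch.nn as nn

def magnitude_prune(model: nn.Module, sparsity: float) -> dict:
    """Return a {name: 0/1 mask} keeping the largest-magnitude weights.

    `sparsity` is the fraction of weights to zero out globally across all
    weight matrices (a Lottery-Ticket-style magnitude criterion).
    """
    weights = [p.abs().flatten() for _, p in model.named_parameters()
               if p.dim() > 1]                      # skip biases / 1-D params
    threshold = torch.quantile(torch.cat(weights), sparsity)
    return {name: (p.abs() > threshold).float()
            for name, p in model.named_parameters() if p.dim() > 1}

def apply_masks(model: nn.Module, masks: dict) -> None:
    """Zero pruned weights in place so only the sparse subnetwork remains."""
    with torch.no_grad():
        for name, p in model.named_parameters():
            if name in masks:
                p.mul_(masks[name])

# Hypothetical usage: the server raises the target sparsity over rounds
# (e.g. 0.2 -> 0.5 -> 0.8) so that clients with less memory or bandwidth
# only ever have to store and transmit the smaller masked model.
if __name__ == "__main__":
    model = nn.Sequential(nn.Conv2d(1, 8, 3), nn.ReLU(),
                          nn.Flatten(), nn.Linear(8 * 26 * 26, 10))
    for round_sparsity in (0.2, 0.5, 0.8):
        masks = magnitude_prune(model, round_sparsity)
        apply_masks(model, masks)
        kept = sum(int(m.sum()) for m in masks.values())
        total = sum(m.numel() for m in masks.values())
        print(f"sparsity {round_sparsity:.1f}: kept {kept}/{total} weights")
```

In a federated setting, only the surviving (non-zero) parameters and the shared mask would need to be exchanged with the server, which is what reduces storage, computation, and communication for constrained clients.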
