Client Selection Approach in Support of Clustered Federated Learning over Wireless Edge Networks

by Abdullatif Albaseer, et al.

Clustered Federated Multitask Learning (CFL) was introduced as an efficient scheme for obtaining reliable specialized models when data is imbalanced and distributed in a non-i.i.d. (non-independent and identically distributed) fashion among clients. While a similarity metric, such as cosine similarity, can be used to endow groups of clients with specialized models, this process can be arduous because the server must involve all clients in every federated learning round. Given the limited bandwidth and latency constraints at the network edge, it is therefore imperative to select only a subset of clients in each round. To this end, this paper proposes a new client selection algorithm that aims to accelerate convergence toward specialized machine learning models that achieve high test accuracy for all client groups. Specifically, we introduce a client selection approach that leverages device heterogeneity to schedule clients based on their round latency, and that exploits bandwidth reuse for clients that need more time to update the model. The server then performs model averaging and clusters the clients based on predefined thresholds. When a specific cluster reaches a stationary point, the proposed algorithm applies greedy scheduling to that group, selecting the clients with lower latency to update the model. Extensive experiments show that the proposed approach lowers the training time and accelerates the convergence rate by up to 50% while providing each client with a specialized model that fits its local data distribution.
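The two server-side ingredients described above, threshold-based clustering of client updates by cosine similarity and greedy latency-aware scheduling within a cluster, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function names, the single similarity threshold, and the fixed client budget are assumptions made for the example.

```python
import math

def cosine_similarity(u, v):
    """Cosine similarity between two flattened model-update vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v) if norm_u and norm_v else 0.0

def cluster_clients(updates, threshold=0.8):
    """Greedily group clients whose updates are similar to a cluster's
    first member (a stand-in for the paper's predefined-threshold
    clustering step; CFL itself uses a more involved bipartitioning).
    `updates` maps client id -> update vector."""
    clusters = []
    for cid, upd in updates.items():
        for cluster in clusters:
            representative = updates[cluster[0]]
            if cosine_similarity(upd, representative) >= threshold:
                cluster.append(cid)
                break
        else:
            clusters.append([cid])  # start a new cluster
    return clusters

def greedy_select(latencies, budget):
    """Once a cluster is stationary, pick the `budget` clients with the
    smallest estimated round latency."""
    return sorted(latencies, key=latencies.get)[:budget]
```

For example, with updates `{"a": [1.0, 0.0], "b": [0.9, 0.1], "c": [0.0, 1.0]}`, `cluster_clients` groups `a` with `b` and leaves `c` alone, and `greedy_select({"a": 3.0, "b": 1.0}, 1)` returns `["b"]`, the faster of the two.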


Fair Selection of Edge Nodes to Participate in Clustered Federated Multitask Learning

Clustered federated Multitask learning is introduced as an efficient tec...

FilFL: Accelerating Federated Learning via Client Filtering

Federated learning is an emerging machine learning paradigm that enables...

Fast Federated Edge Learning with Overlapped Communication and Computation and Channel-Aware Fair Client Scheduling

We consider federated edge learning (FEEL) over wireless fading channels...

When Computing Power Network Meets Distributed Machine Learning: An Efficient Federated Split Learning Framework

In this paper, we advocate CPN-FedSL, a novel and flexible Federated Spl...

Heterogeneity-aware Clustered Distributed Learning for Multi-source Data Analysis

In diverse fields ranging from finance to omics, it is increasingly comm...

Revisiting Comparative Performance of DNS Resolvers in the IPv6 and ECS Era

This paper revisits the issue of the performance of DNS resolution servi...

FedSoft: Soft Clustered Federated Learning with Proximal Local Updating

Traditionally, clustered federated learning groups clients with the same...
