Personalized Federated Learning via Maximizing Correlation with Sparse and Hierarchical Extensions

07/12/2021
by Yinchuan Li, et al.

Federated Learning (FL) is a collaborative machine learning technique for training a global model without accessing clients' private data. The main challenges in FL are statistical diversity among clients, the limited computing capability of client devices, and the excessive communication overhead and long latency between server and clients. To address these problems, we propose a novel personalized federated learning method via maximizing correlation (pFedMac), and further extend it to sparse and hierarchical models. By minimizing loss functions that incorporate an approximated L1-norm and hierarchical correlation, performance on statistically diverse data is improved and the communication and computation loads required in the network are reduced. Theoretical proofs show that pFedMac performs better than L2-norm-distance-based personalization methods. Experimentally, we demonstrate the benefits of this sparse hierarchical personalization architecture compared with state-of-the-art personalization methods and their extensions (e.g., pFedMac achieves 99.75% accuracy on Synthetic under heterogeneous and non-i.i.d. data distributions).
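As a rough illustration of the kind of objective the abstract describes, the sketch below combines a client's local loss with a correlation-maximizing term toward the global model and a smoothed L1 penalty for sparsity. This is a hedged, hypothetical reconstruction, not the paper's exact formulation: the function names, the inner-product form of the correlation term, the smoothing of the L1 norm, and the coefficients `lam` and `mu` are all assumptions for illustration.

```python
import numpy as np

def smoothed_l1(theta, eps=1e-3):
    # A common smooth approximation of the L1 norm:
    # sum_j sqrt(theta_j^2 + eps); differentiable everywhere.
    return np.sum(np.sqrt(theta**2 + eps))

def pfedmac_style_objective(theta_i, w_global, local_loss, lam=0.1, mu=0.01):
    # Hypothetical per-client objective: local empirical loss,
    # minus a correlation (inner-product) reward that pulls the
    # personalized model theta_i toward the global model w_global,
    # plus a smoothed-L1 sparsity penalty.
    correlation = np.dot(theta_i, w_global)
    return local_loss(theta_i) - lam * correlation + mu * smoothed_l1(theta_i)

# Toy usage with a quadratic local loss.
theta = np.array([1.0, -2.0])
w = np.array([0.5, 0.5])
value = pfedmac_style_objective(theta, w, lambda t: np.sum(t**2))
```

Maximizing the inner product (rather than minimizing an L2 distance to the global model, as in some prior personalization methods) is the distinction the abstract's theoretical claim refers to; the sparsity penalty is what would reduce communication and computation.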
