A Decentralized Collaborative Learning Framework Across Heterogeneous Devices for Personalized Predictive Analytics

05/27/2022
by Guanhua Ye, et al.

In this paper, we propose a Similarity-based Decentralized Knowledge Distillation (SD-Dist) framework for collaboratively learning heterogeneous deep models on decentralized devices. By introducing a preloaded reference dataset, SD-Dist enables every participant device to identify similar users and distill knowledge from them without assuming a fixed model architecture. Moreover, none of these operations reveals sensitive information such as personal data or model parameters. Extensive experiments on three real-life datasets show that SD-Dist achieves competitive performance with fewer computational resources while preserving model heterogeneity and privacy. Our experiments also show that the framework improves the robustness of the resultant models when users' data is sparse and diverse.
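The central mechanism, exchanging predictions on a shared reference set instead of raw data or model parameters, can be illustrated with a short sketch. The snippet below is a minimal, hypothetical realization in PyTorch: the function names, the use of cosine similarity between reference-set predictions as a user-similarity proxy, and the KL-divergence distillation loss are our assumptions for illustration, not the authors' actual implementation.

```python
import torch
import torch.nn.functional as F


@torch.no_grad()
def reference_soft_labels(model, reference_x, T=2.0):
    # Soft predictions on the shared reference set; these (not raw data
    # or model parameters) are what a device would share with peers.
    model.eval()
    return F.softmax(model(reference_x) / T, dim=1)


def peer_similarity(own_soft, peer_soft):
    # Cosine similarity of flattened reference-set predictions, used here
    # as a stand-in for user similarity (an assumed choice).
    return F.cosine_similarity(own_soft.flatten(), peer_soft.flatten(), dim=0)


def local_distillation_step(model, optimizer, reference_x, peer_softs, sims, T=2.0):
    # Distill toward a similarity-weighted average of peers' predictions,
    # so more similar users contribute more to the target distribution.
    model.train()
    own_log_soft = F.log_softmax(model(reference_x) / T, dim=1)
    weights = F.softmax(torch.stack(sims), dim=0)              # (num_peers,)
    target = (torch.stack(peer_softs) * weights.view(-1, 1, 1)).sum(dim=0)
    loss = F.kl_div(own_log_soft, target, reduction="batchmean") * (T * T)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

Under these assumptions, a device would collect peers' `reference_soft_labels`, score each peer with `peer_similarity`, and interleave `local_distillation_step` with its ordinary local training; because only reference-set predictions cross device boundaries, each participant can keep an arbitrary private architecture.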
