Collaborative Learning via Prediction Consensus

05/29/2023
by Dongyang Fan, et al.

We consider a collaborative learning setting where each agent's goal is to improve their own model by leveraging the expertise of collaborators, in addition to their own training data. To facilitate the exchange of expertise among agents, we propose a distillation-based method leveraging unlabeled auxiliary data, which is pseudo-labeled by the collective. Central to our method is a trust weighting scheme that adaptively weighs the influence of each collaborator on the pseudo-labels until a consensus on how to label the auxiliary data is reached. We demonstrate that our collaboration scheme significantly boosts individual models' performance with respect to the global distribution, compared to local training. At the same time, the adaptive trust weights can effectively identify and mitigate the negative impact of bad models on the collective. We find that our method is particularly effective in the presence of heterogeneity among individual agents, both in terms of training data and model architectures.
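The abstract does not spell out the trust-update rule, but the core loop it describes (trust-weighted pseudo-labels on auxiliary data, iterated until consensus) can be sketched as follows. This is a minimal illustration, assuming trust weights are re-estimated from each agent's agreement with the current consensus; the paper's actual scheme may differ.

```python
import numpy as np

def consensus_pseudo_labels(predictions, n_iters=10):
    """Iteratively form soft pseudo-labels on an unlabeled auxiliary set
    as a trust-weighted average of agent predictions.

    predictions: array of shape (n_agents, n_samples, n_classes), where
                 each agent contributes softmax outputs on the auxiliary data.
    Returns the consensus pseudo-labels and the final trust weights.
    """
    n_agents = predictions.shape[0]
    trust = np.full(n_agents, 1.0 / n_agents)  # start from uniform trust
    for _ in range(n_iters):
        # Trust-weighted soft pseudo-labels for the auxiliary data.
        consensus = np.einsum("a,anc->nc", trust, predictions)
        # Assumed update rule: re-weight each agent by its mean agreement
        # with the consensus (closer match -> more influence next round).
        agreement = np.array(
            [(p * consensus).sum(axis=1).mean() for p in predictions]
        )
        trust = agreement / agreement.sum()
    return consensus, trust
```

On toy inputs, an agent whose predictions disagree with the majority ends up with a down-weighted trust score, which is the mechanism the abstract credits with mitigating the impact of bad models.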


