All4One: Symbiotic Neighbour Contrastive Learning via Self-Attention and Redundancy Reduction

03/16/2023
by Imanol G. Estepa, et al.

Nearest neighbour based methods have proved to be one of the most successful self-supervised learning (SSL) approaches due to their high generalization capabilities. However, their computational efficiency decreases when more than one neighbour is used. In this paper, we propose a novel contrastive SSL approach, which we call All4One, that reduces the distance between neighbour representations using "centroids" created through a self-attention mechanism. We use a Centroid Contrasting objective along with single Neighbour Contrasting and Feature Contrasting objectives. Centroids help in learning contextual information from multiple neighbours, whereas the neighbour contrast enables learning representations directly from the neighbours and the feature contrast allows learning representations unique to the features. This combination enables All4One to outperform popular instance discrimination approaches by more than 1% and obtain state-of-the-art (SoTA) results. Finally, we show that All4One is robust towards embedding dimensionalities and augmentations, surpassing NNCLR and Barlow Twins by more than 5% in these settings. The source code will be made available soon.
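The centroid mechanism described above can be sketched as plain single-head self-attention applied to a sample's k nearest-neighbour embeddings: each output row is an attention-weighted mixture of all k neighbours, which is what lets a centroid carry contextual information from multiple neighbours at once. This is a minimal illustrative sketch, not the authors' implementation; in particular, the identity query/key/value projections and the function name are assumptions here, whereas All4One learns its projection weights end to end.

```python
import numpy as np

def attention_centroids(neighbours, d_k=None):
    """Toy single-head self-attention over k neighbour embeddings.

    neighbours: (k, d) array holding a sample's k nearest-neighbour
    embeddings. Returns a (k, d) array of "centroids": each row is a
    softmax-weighted mixture of all k neighbours.
    """
    k, d = neighbours.shape
    d_k = d_k or d
    # Identity projections for this sketch; the real method learns them.
    q = k_mat = v = neighbours
    scores = q @ k_mat.T / np.sqrt(d_k)           # (k, k) similarity logits
    scores -= scores.max(axis=1, keepdims=True)   # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=1, keepdims=True) # row-wise softmax
    return weights @ v                            # (k, d) centroids

rng = np.random.default_rng(0)
nbrs = rng.normal(size=(5, 8))      # 5 neighbours, 8-dim embeddings
cents = attention_centroids(nbrs)
print(cents.shape)                  # (5, 8)
```

Because the attention weights form a convex combination, every centroid lies inside the convex hull of the neighbour embeddings, so contrasting against centroids is a softer target than contrasting against any single neighbour.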

Related research:

09/10/2021 · Attention-based Contrastive Learning for Winograd Schemas
Self-supervised learning has recently attracted considerable attention i...

01/26/2021 · Revisiting Contrastive Learning for Few-Shot Classification
Instance discrimination based contrastive learning has emerged as a lead...

03/27/2022 · HELoC: Hierarchical Contrastive Learning of Source Code Representation
Abstract syntax trees (ASTs) play a crucial role in source code represen...

02/10/2022 · Using Navigational Information to Learn Visual Representations
Children learn to build a visual representation of the world from unsupe...

07/20/2021 · Group Contrastive Self-Supervised Learning on Graphs
We study self-supervised learning on graphs using contrastive methods. A...

10/24/2022 · Composition, Attention, or Both?
In this paper, we propose a novel architecture called Composition Attent...

03/23/2023 · Temperature Schedules for Self-Supervised Contrastive Methods on Long-Tail Data
Most approaches for self-supervised learning (SSL) are optimised on cura...
