Self-Supervised Learning Through Efference Copies

10/17/2022
by Franz Scherr, et al.

Self-supervised learning (SSL) methods aim to exploit the abundance of unlabelled data for machine learning (ML); however, the underlying principles are often method-specific. An SSL framework derived from biological first principles of embodied learning could unify the various SSL methods, help elucidate learning in the brain, and possibly improve ML. SSL commonly transforms each training datapoint into a pair of views, uses the knowledge of this pairing as a positive (i.e. non-contrastive) self-supervisory signal, and potentially opposes it to unrelated (i.e. contrastive) negative examples. Here, we show that this type of self-supervision is an incomplete implementation of a concept from neuroscience, the Efference Copy (EC). Specifically, the brain also transforms the environment through efference, i.e. motor commands; however, it sends itself an EC of the full commands, i.e. more than a mere pairing signal. In addition, its action representations are likely egocentric. From this principled foundation we formally recover and extend SSL methods such as SimCLR, BYOL, and ReLIC under a common theoretical framework, Self-supervision Through Efference Copies (S-TEC). Empirically, S-TEC meaningfully restructures the within- and between-class representations, which manifests as improvements over recent strong SSL baselines in image classification, segmentation, object detection, and in audio. These results suggest a testable hypothesis: the brain's motor outputs exert a positive influence on its sensory representations.
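To make the core idea concrete, below is a minimal, hypothetical PyTorch-style sketch of how a standard two-view SSL objective could be extended with an efference-copy-like signal: besides the usual pairing (contrastive) loss, an auxiliary head is trained to predict the full transformation parameters relating the two views. The module names (EfferenceCopySSL, action_head), the InfoNCE implementation, and the action parameterization are illustrative assumptions, not the authors' S-TEC implementation.

```python
# Hypothetical sketch: two-view SSL loss plus an auxiliary "efference copy"
# head that predicts the augmentation parameters relating the views.
# All names and the action parameterization are illustrative, not S-TEC itself.

import torch
import torch.nn as nn
import torch.nn.functional as F


def info_nce(z1, z2, temperature=0.1):
    """SimCLR-style contrastive loss between two batches of embeddings."""
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    logits = z1 @ z2.t() / temperature                      # (B, B) similarities
    targets = torch.arange(z1.size(0), device=z1.device)    # positives on the diagonal
    return F.cross_entropy(logits, targets)


class EfferenceCopySSL(nn.Module):
    """Encoder + projector, plus a head that regresses the 'action'
    (here: the augmentation parameters) that maps one view to the other."""

    def __init__(self, feat_dim=128, action_dim=4):
        super().__init__()
        self.encoder = nn.Sequential(nn.Flatten(),
                                     nn.Linear(3 * 32 * 32, 512),
                                     nn.ReLU())
        self.projector = nn.Linear(512, feat_dim)
        # Efference-copy head: predicts the transform parameters from both views.
        self.action_head = nn.Linear(2 * 512, action_dim)

    def forward(self, view1, view2, action_params):
        h1, h2 = self.encoder(view1), self.encoder(view2)
        z1, z2 = self.projector(h1), self.projector(h2)
        # Standard pairing signal (positive / contrastive term).
        loss_ssl = info_nce(z1, z2)
        # Full efference copy: supervise with the actual transform parameters
        # (e.g. crop offsets, scale), not just the fact that the views are paired.
        pred_action = self.action_head(torch.cat([h1, h2], dim=1))
        loss_ec = F.mse_loss(pred_action, action_params)
        return loss_ssl + loss_ec


# Toy usage: in practice, action_params would hold the augmentation parameters
# recorded when the two views were generated.
model = EfferenceCopySSL()
v1, v2 = torch.randn(8, 3, 32, 32), torch.randn(8, 3, 32, 32)
actions = torch.randn(8, 4)
loss = model(v1, v2, actions)
loss.backward()
```

In this reading, passing `action_params` back to the network plays the role of the efference copy: a copy of the full "motor command" (the transformation applied to the input) is sent alongside the resulting sensory views, rather than only the binary knowledge that the two views form a pair.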


