No Free Lunch in Self Supervised Representation Learning

04/23/2023
by Ihab Bendidi, et al.

Self-supervised representation learning in computer vision relies heavily on hand-crafted image transformations to learn meaningful and invariant features. However, few extensive explorations of the impact of transformation design have been conducted in the literature. In particular, the dependence of downstream performance on transformation design has been established, but not studied in depth. In this work, we explore this relationship and its impact on a domain other than natural images, and show that designing the transformations can be viewed as a form of supervision. First, we demonstrate that transformations not only affect downstream performance and the relevance of clustering, but also that each category in a supervised dataset can be impacted differently. Next, we explore the impact of transformation design on microscopy images, a domain where the differences between classes are more subtle and fuzzy than in natural images, and observe an even greater effect on downstream task performance. Finally, we demonstrate that transformation design can be leveraged as a form of supervision: careful selection of transformations by a domain expert can lead to a drastic increase in performance on a given downstream task.
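For concreteness, the kind of hand-crafted transformation pipeline the abstract refers to might look like the SimCLR-style sketch below, written with torchvision. The specific operations and parameters (crop scale, jitter strength, blur settings) are illustrative assumptions, not the pipeline used in the paper; the point is that each choice implicitly supervises which image features the representation becomes invariant to.

import torchvision.transforms as T

# SimCLR-style augmentation pipeline (illustrative parameters).
# Every design choice here encodes an invariance: the crop scale,
# the color jitter strength, and the blur settings each decide which
# image variations the learned representation should ignore.
ssl_transform = T.Compose([
    T.RandomResizedCrop(224, scale=(0.2, 1.0)),
    T.RandomHorizontalFlip(p=0.5),
    T.RandomApply([T.ColorJitter(0.4, 0.4, 0.4, 0.1)], p=0.8),
    T.RandomGrayscale(p=0.2),
    T.GaussianBlur(kernel_size=23, sigma=(0.1, 2.0)),
    T.ToTensor(),
])

# Two independently augmented views of the same image form a positive
# pair; a contrastive loss pulls their embeddings together, so the
# encoder learns to discard exactly the factors the transforms vary.
# view_1, view_2 = ssl_transform(image), ssl_transform(image)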

Related research

03/23/2021
Self-supervised representation learning from 12-lead ECG data
We put forward a comprehensive assessment of self-supervised representat...

02/18/2020
Data Transformation Insights in Self-supervision with Clustering Tasks
Self-supervision is key to extending use of deep learning for label scar...

02/20/2021
Self-Supervised Learning via multi-Transformation Classification for Action Recognition
Self-supervised tasks have been utilized to build useful representations...

12/14/2021
On the use of Cortical Magnification and Saccades as Biological Proxies for Data Augmentation
Self-supervised learning is a powerful way to learn useful representatio...

08/22/2017
Representation Learning by Learning to Count
We introduce a novel method for representation learning that uses an art...

11/24/2019
Towards a Hypothesis on Visual Transformation based Self-Supervision
We propose the first qualitative hypothesis characterizing the behavior ...

02/04/2015
Learning Local Invariant Mahalanobis Distances
For many tasks and data types, there are natural transformations to whic...
