Self-Supervised Learning by Estimating Twin Class Distributions

10/14/2021
by Feng Wang, et al.

We present TWIST, a novel self-supervised representation learning method that classifies large-scale unlabeled datasets in an end-to-end way. We employ a siamese network terminated by a softmax operation to produce twin class distributions for two augmented views of an image. Without supervision, we enforce the class distributions of different augmentations to be consistent. At the same time, we regularize the class distributions to make them sharp and diverse: we minimize the entropy of each sample's distribution so that its class prediction is assertive, and maximize the entropy of the mean distribution over samples so that the predictions of different samples are diverse. In this way, TWIST naturally avoids trivial solutions without specific designs such as an asymmetric network, a stop-gradient operation, or a momentum encoder. Unlike clustering-based methods, which alternate between clustering and learning, our method is a single learning process guided by a unified loss function. As a result, TWIST outperforms state-of-the-art methods on a wide range of tasks, including unsupervised classification, linear classification, semi-supervised learning, transfer learning, and dense prediction tasks such as detection and segmentation.

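To make the objective concrete, here is a minimal, hypothetical PyTorch sketch of a TWIST-style loss, assuming p1 and p2 are the softmax outputs of the shared siamese network for two augmentations of the same batch; the function name, signature, and relative weighting of the terms are illustrative assumptions, not the paper's exact formulation.

import torch

def twist_style_loss(p1, p2, eps=1e-8):
    # Sketch of a TWIST-style objective (not the authors' exact code).
    # p1, p2: (batch, num_classes) class probabilities for two augmented
    # views of the same images, produced by a shared softmax-terminated net.

    # Consistency: the twin distributions of the two views should agree
    # (symmetrized cross-entropy between the per-sample distributions).
    consistency = -0.5 * ((p1 * torch.log(p2 + eps)).sum(dim=1).mean()
                          + (p2 * torch.log(p1 + eps)).sum(dim=1).mean())

    # Sharpness: add the per-sample entropy, so minimizing the loss makes
    # each prediction assertive (close to one-hot).
    sharpness = -0.5 * ((p1 * torch.log(p1 + eps)).sum(dim=1).mean()
                        + (p2 * torch.log(p2 + eps)).sum(dim=1).mean())

    # Diversity: subtract the entropy of the mean distribution over the
    # batch, so minimizing the loss spreads predictions across classes.
    mean_p = 0.5 * (p1.mean(dim=0) + p2.mean(dim=0))
    diversity = (mean_p * torch.log(mean_p + eps)).sum()

    return consistency + sharpness + diversity

# Hypothetical usage with a shared (siamese) backbone `model`:
#   loss = twist_style_loss(model(view1).softmax(dim=1),
#                           model(view2).softmax(dim=1))

Because the diversity term penalizes a peaked mean distribution while the sharpness term penalizes flat per-sample distributions, the two regularizers together rule out collapse to a single class without any asymmetric network, stop-gradient, or momentum encoder.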

Related research:

- Semi-supervised learning method based on predefined evenly-distributed class centroids (01/13/2020): Compared to supervised learning, semi-supervised learning reduces the de...
- MoPro: Webly Supervised Learning with Momentum Prototypes (09/17/2020): We propose a webly-supervised representation learning method that does n...
- Semi-supervised Contrastive Learning with Similarity Co-calibration (05/16/2021): Semi-supervised learning acts as an effective way to leverage massive un...
- Probabilistic Self-supervised Learning via Scoring Rules Minimization (09/05/2023): In this paper, we propose a novel probabilistic self-supervised learning...
- Self-Supervised Classification Network (03/19/2021): We present Self-Classifier – a novel self-supervised end-to-end classifi...
- Relational Self-Supervised Learning (03/16/2022): Self-supervised Learning (SSL) including the mainstream contrastive lear...
- Learning Kernel for Conditional Moment-Matching Discrepancy-based Image Classification (08/24/2020): Conditional Maximum Mean Discrepancy (CMMD) can capture the discrepancy ...
