Semi-Supervised Learning with Scarce Annotations

05/21/2019
by   Sylvestre-Alvise Rebuffi, et al.

While semi-supervised learning (SSL) algorithms provide an efficient way to make use of both labelled and unlabelled data, they generally struggle when the number of annotated samples is very small. In this work, we consider the problem of SSL multi-class classification with very few labelled instances. We introduce two key ideas. The first is simple but effective: we leverage transfer learning across tasks and self-supervision to initialize a good representation of the data without using any labels. The second is a new SSL algorithm that can effectively exploit such a pre-trained representation. The algorithm alternates between two phases, one fitting the labelled points and one fitting the unlabelled ones, with carefully controlled information flow between them. This greatly reduces overfitting of the labelled data and avoids the issue of balancing labelled and unlabelled losses during training. We show empirically that this method can successfully train competitive models with as few as 10 labelled data points per class. More generally, we show that bootstrapping features with self-supervised learning always improves SSL on standard benchmarks, and that our algorithm performs increasingly well compared to other methods when refining from other tasks or datasets.
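The full training procedure is given in the paper; the alternating two-phase loop described above can be sketched roughly as follows. This is a minimal illustrative sketch in PyTorch, not the authors' implementation: the names (alternating_ssl, labelled_loader, unlabelled_loader), the optimizer settings, and the use of a frozen snapshot with hard pseudo-labels as the mechanism that controls information flow between the two phases are all assumptions made for the example.

import copy
import torch
import torch.nn.functional as F

def alternating_ssl(model, labelled_loader, unlabelled_loader,
                    epochs=10, lr=1e-3, device="cpu"):
    # model is assumed to be initialized with a self-supervised pre-trained
    # representation, as advocated in the abstract.
    model.to(device)
    opt = torch.optim.SGD(model.parameters(), lr=lr, momentum=0.9)

    for epoch in range(epochs):
        # Phase 1: fit the few labelled points with standard cross-entropy.
        model.train()
        for x, y in labelled_loader:
            x, y = x.to(device), y.to(device)
            loss = F.cross_entropy(model(x), y)
            opt.zero_grad()
            loss.backward()
            opt.step()

        # Control the information flow between phases: freeze a snapshot so
        # that phase 2 only sees phase 1 through its pseudo-labels
        # (one possible realisation of the "carefully controlled" flow).
        snapshot = copy.deepcopy(model).eval()

        # Phase 2: fit the unlabelled points against the snapshot's predictions.
        for x in unlabelled_loader:  # loader assumed to yield input batches only
            x = x.to(device)
            with torch.no_grad():
                pseudo = snapshot(x).argmax(dim=1)
            loss = F.cross_entropy(model(x), pseudo)
            opt.zero_grad()
            loss.backward()
            opt.step()
    return model

Keeping the labelled and unlabelled objectives in separate phases, rather than summing them, sidesteps the need to tune a weighting between the two losses, which is the balancing issue the abstract refers to.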

