Improving out-of-distribution generalization via multi-task self-supervised pretraining

03/30/2020
by Isabela Albuquerque et al.

Self-supervised feature representations have been shown to be useful for supervised classification, few-shot learning, and adversarial robustness. We show that features obtained using self-supervised learning are comparable to, or better than, those from supervised learning for domain generalization in computer vision. We introduce a new self-supervised pretext task of predicting responses to Gabor filter banks and demonstrate that multi-task learning of compatible pretext tasks improves domain generalization performance compared to training individual tasks alone. Features learned through self-supervision generalize better to unseen domains than their supervised counterparts when there is a larger domain shift between training and test distributions, and they even show better localization ability for objects of interest. Self-supervised feature representations can also be combined with other domain generalization methods to further boost performance.
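The core idea of the proposed pretext task is to regress a network's output onto the responses of a fixed Gabor filter bank applied to the input image. The sketch below builds a small bank of oriented Gabor kernels in plain numpy and computes pooled responses that could serve as regression targets; the filter parameters, the mean-of-absolute-values pooling, and all function names here are illustrative assumptions, not the paper's exact recipe.

```python
import numpy as np

def gabor_kernel(size=15, sigma=3.0, theta=0.0, lambd=8.0, gamma=0.5, psi=0.0):
    """Real part of a Gabor kernel (standard formulation; parameters are illustrative)."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1].astype(float)
    # Rotate coordinates by the filter orientation theta.
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    envelope = np.exp(-(xr ** 2 + (gamma * yr) ** 2) / (2 * sigma ** 2))
    carrier = np.cos(2 * np.pi * xr / lambd + psi)
    return envelope * carrier

def filter_bank(n_orientations=4):
    """Bank of Gabor filters at evenly spaced orientations."""
    return [gabor_kernel(theta=np.pi * k / n_orientations)
            for k in range(n_orientations)]

def conv2d_valid(img, kernel):
    """'Valid' 2-D correlation via a sliding window (fine for small inputs)."""
    H, W = img.shape
    kh, kw = kernel.shape
    out = np.empty((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

def gabor_targets(img, bank):
    """Pooled filter responses: one regression target per filter in the bank."""
    return np.array([np.mean(np.abs(conv2d_valid(img, k))) for k in bank])

rng = np.random.default_rng(0)
img = rng.random((32, 32))          # stand-in for a grayscale input image
targets = gabor_targets(img, filter_bank())
print(targets.shape)                # one scalar target per orientation
```

In a multi-task setup, these targets would be predicted by one head of a shared encoder alongside other pretext tasks (e.g. rotation prediction), with the per-task losses summed during pretraining.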


