SupMAE: Supervised Masked Autoencoders Are Efficient Vision Learners

05/28/2022
by Feng Liang, et al.

Self-supervised Masked Autoencoders (MAE) are emerging as a new pre-training paradigm in computer vision. MAE learns semantics implicitly by reconstructing local patches, and it requires thousands of pre-training epochs to achieve favorable performance. This paper incorporates explicit supervision, i.e., golden labels, into the MAE framework. The proposed Supervised MAE (SupMAE) exploits only a visible subset of image patches for classification, unlike standard supervised pre-training, which uses all image patches. SupMAE is efficient and can achieve comparable performance with MAE using only 30% of the compute when evaluated on ImageNet with the ViT-B/16 model. Detailed ablation studies are conducted to verify the proposed components.
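The abstract describes the core idea: keep MAE's random masking and reconstruction objective, but add a supervised classification branch that sees only the visible patch subset. Below is a minimal, hypothetical PyTorch sketch of that idea; the class and variable names are illustrative, not the authors' implementation, and a real MAE decoder would predict the masked patches with a full ViT encoder/decoder rather than the toy modules used here.

```python
# Hypothetical sketch of the SupMAE idea: an MAE-style encoder sees only the
# visible patches, and a classification head on those visible tokens adds a
# supervised loss on top of a reconstruction loss.
import torch
import torch.nn as nn


class SupMAESketch(nn.Module):
    def __init__(self, num_classes=1000, patch_dim=768, embed_dim=768, mask_ratio=0.75):
        super().__init__()
        self.mask_ratio = mask_ratio
        self.patch_embed = nn.Linear(patch_dim, embed_dim)
        layer = nn.TransformerEncoderLayer(embed_dim, nhead=12, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)  # toy depth
        self.decoder_pred = nn.Linear(embed_dim, patch_dim)  # reconstruction branch
        self.cls_head = nn.Linear(embed_dim, num_classes)    # supervised branch

    def forward(self, patches, labels):
        B, N, D = patches.shape
        num_keep = int(N * (1 - self.mask_ratio))
        # Randomly keep a visible subset of patches, as in MAE.
        ids = torch.rand(B, N, device=patches.device).argsort(dim=1)
        keep = ids[:, :num_keep]
        visible = torch.gather(patches, 1, keep.unsqueeze(-1).expand(-1, -1, D))
        tokens = self.encoder(self.patch_embed(visible))
        # Supervised loss: classify from the pooled visible tokens only.
        logits = self.cls_head(tokens.mean(dim=1))
        cls_loss = nn.functional.cross_entropy(logits, labels)
        # Simplified reconstruction loss on the visible tokens (a real MAE
        # decoder would instead reconstruct the masked patches).
        recon_loss = nn.functional.mse_loss(self.decoder_pred(tokens), visible)
        return cls_loss + recon_loss


# Toy usage with random data (B x N x flattened 16x16x3 patches).
model = SupMAESketch()
patches = torch.randn(2, 196, 768)
labels = torch.randint(0, 1000, (2,))
loss = model(patches, labels)
loss.backward()
```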


research · 12/20/2021
Are Large-scale Datasets Necessary for Self-Supervised Pre-training?
Pre-training models on large scale datasets, like ImageNet, is a standar...

research · 03/17/2023
Denoising Diffusion Autoencoders are Unified Self-supervised Learners
Inspired by recent advances in diffusion models, which are reminiscent o...

research · 05/28/2022
Object-wise Masked Autoencoders for Fast Pre-training
Self-supervised pre-training for images without labels has recently achi...

research · 11/11/2021
Masked Autoencoders Are Scalable Vision Learners
This paper shows that masked autoencoders (MAE) are scalable self-superv...

research · 03/23/2022
VideoMAE: Masked Autoencoders are Data-Efficient Learners for Self-Supervised Video Pre-Training
Pre-training video transformers on extra large-scale datasets is general...

research · 11/17/2022
CAE v2: Context Autoencoder with CLIP Target
Masked image modeling (MIM) learns visual representation by masking and ...

research · 08/22/2020
Supervision Levels Scale (SLS)
We propose a three-dimensional discrete and incremental scale to encode ...
