Rethinking Out-of-distribution (OOD) Detection: Masked Image Modeling is All You Need

02/06/2023
by Jingyao Li, et al.

The core of out-of-distribution (OOD) detection is to learn an in-distribution (ID) representation that is distinguishable from OOD samples. Previous work applied recognition-based methods to learn the ID features, which tend to learn shortcuts instead of comprehensive representations. In this work, we find, surprisingly, that simply using reconstruction-based methods can boost the performance of OOD detection significantly. We deeply explore the main contributors to OOD detection and find that reconstruction-based pretext tasks have the potential to provide a generally applicable and efficacious prior, which benefits the model in learning the intrinsic data distribution of the ID dataset. Specifically, we take Masked Image Modeling as the pretext task for our OOD detection framework (MOOD). Without bells and whistles, MOOD outperforms the previous SOTA of one-class OOD detection by 5.7% and of multi-class OOD detection by 3.0%. It even defeats 10-shot-per-class outlier exposure OOD detection, although we do not include any OOD samples for our detection.
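To make the idea concrete, below is a minimal sketch of the general recipe the abstract describes: pretrain an encoder with a masked-image-modeling objective (here a SimMIM-style variant, where masked patches are replaced by a learnable token and only masked patches contribute to the reconstruction loss), then score test images by their Mahalanobis distance to the ID feature distribution. This is not the paper's implementation; all names (TinyMIM, patchify, mahalanobis_ood_score), architecture sizes, and hyperparameters are illustrative assumptions.

```python
import torch
import torch.nn as nn


def patchify(imgs, p=4):
    """Split (B, C, H, W) images into non-overlapping p x p patches -> (B, N, C*p*p)."""
    B, C, H, W = imgs.shape
    x = imgs.reshape(B, C, H // p, p, W // p, p)
    return x.permute(0, 2, 4, 1, 3, 5).reshape(B, (H // p) * (W // p), C * p * p)


class TinyMIM(nn.Module):
    """Toy SimMIM-style masked image model: masked patches are replaced by a
    learnable mask token, the full sequence is encoded, and the loss is the
    pixel reconstruction error on the masked patches only."""

    def __init__(self, patch_dim=48, dim=128, depth=2, num_patches=64, mask_ratio=0.75):
        super().__init__()
        self.embed = nn.Linear(patch_dim, dim)
        self.pos = nn.Parameter(torch.zeros(1, num_patches, dim))
        layer = nn.TransformerEncoderLayer(d_model=dim, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=depth)
        self.decoder = nn.Linear(dim, patch_dim)   # reconstruct raw pixels per patch
        self.mask_token = nn.Parameter(torch.zeros(1, 1, dim))
        self.mask_ratio = mask_ratio

    def forward(self, imgs):
        patches = patchify(imgs)                           # (B, N, patch_dim)
        tokens = self.embed(patches) + self.pos            # (B, N, dim)
        B, N, _ = tokens.shape
        mask = torch.rand(B, N, device=imgs.device) < self.mask_ratio  # True = masked
        tokens = torch.where(mask.unsqueeze(-1), self.mask_token.expand(B, N, -1), tokens)
        feats = self.encoder(tokens)
        recon = self.decoder(feats)
        loss = ((recon - patches) ** 2).mean(dim=-1)[mask].mean()
        return loss, feats.mean(dim=1)                     # pooled per-image feature


def mahalanobis_ood_score(feats, id_mean, id_precision):
    """Higher score = farther from the ID feature distribution (more likely OOD)."""
    diff = feats - id_mean
    return torch.einsum("bd,de,be->b", diff, id_precision, diff)


if __name__ == "__main__":
    model = TinyMIM()
    imgs = torch.randn(8, 3, 32, 32)                       # e.g. CIFAR-sized ID images
    loss, feats = model(imgs)
    loss.backward()                                        # one MIM pretraining step

    # Fit a single Gaussian to the ID features, then score samples against it.
    with torch.no_grad():
        mean = feats.mean(dim=0)
        cov = torch.cov(feats.T) + 1e-3 * torch.eye(feats.shape[1])  # regularized covariance
        precision = torch.linalg.inv(cov)
        scores = mahalanobis_ood_score(feats, mean, precision)
    print(scores.shape)  # torch.Size([8])
```

In the paper's full pipeline, the MIM-pretrained encoder is reportedly fine-tuned on the ID dataset before features are extracted and scored; the sketch above collapses those stages into one pass for brevity and fits the Gaussian on a single toy batch.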
