Benchmarking Self-Supervised Learning on Diverse Pathology Datasets

12/09/2022
by Mingu Kang, et al.

Computational pathology can help save human lives, but its models are annotation-hungry and pathology images are notoriously expensive to annotate. Self-supervised learning (SSL) has been shown to be an effective method for utilizing unlabeled data, and its application to pathology could greatly benefit downstream tasks. Yet, there are no principled studies that compare SSL methods and discuss how to adapt them for pathology. To address this need, we execute the largest-scale study of SSL pre-training on pathology image data to date. Our study is conducted using 4 representative SSL methods on diverse downstream tasks. We establish that large-scale domain-aligned pre-training in pathology consistently outperforms ImageNet pre-training in standard SSL settings such as linear and fine-tuning evaluations, as well as in low-label regimes. Moreover, we propose a set of domain-specific techniques that we experimentally show lead to a performance boost. Lastly, for the first time, we apply SSL to the challenging task of nuclei instance segmentation and show large and consistent performance improvements under diverse settings.
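For readers unfamiliar with the linear-evaluation protocol the abstract refers to, the sketch below shows the standard recipe in PyTorch: freeze an SSL-pre-trained backbone and train only a linear classifier on its features. This is an illustration under assumptions, not the paper's implementation; the ResNet-50 backbone, the 9-class head, and the random stand-in tensors are hypothetical placeholders.

```python
# Minimal linear-probe sketch (illustrative; not the paper's exact protocol).
# Assumptions: a ResNet-50 backbone whose SSL weights are loaded separately,
# and a labeled patch dataset; random tensors stand in for pathology patches.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset
from torchvision.models import resnet50

backbone = resnet50(weights=None)   # load SSL-pre-trained weights here
backbone.fc = nn.Identity()         # expose the 2048-d pooled features
for p in backbone.parameters():     # freeze the backbone: only the linear
    p.requires_grad = False         # head below is trained
backbone.eval()

num_classes = 9                     # hypothetical, e.g. tissue categories
head = nn.Linear(2048, num_classes)

# Stand-in data: replace with real pathology patches and labels.
xs = torch.randn(64, 3, 224, 224)
ys = torch.randint(0, num_classes, (64,))
loader = DataLoader(TensorDataset(xs, ys), batch_size=16, shuffle=True)

opt = torch.optim.SGD(head.parameters(), lr=0.1, momentum=0.9)
loss_fn = nn.CrossEntropyLoss()

for epoch in range(5):
    for x, y in loader:
        with torch.no_grad():       # features come from the frozen backbone
            feats = backbone(x)
        loss = loss_fn(head(feats), y)
        opt.zero_grad()
        loss.backward()
        opt.step()
```

Fine-tuning evaluation differs only in that the backbone parameters are left trainable (and typically optimized with a smaller learning rate), which is why the two protocols are usually reported side by side.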


