Bootstrapping the Relationship Between Images and Their Clean and Noisy Labels

by Brandon Smart, et al.

Many state-of-the-art noisy-label learning methods rely on learning mechanisms that estimate the samples' clean labels during training and discard their original noisy labels. However, this approach prevents the learning of the relationship between images, noisy labels and clean labels, which has been shown to be useful when dealing with instance-dependent label noise problems. Furthermore, methods that do aim to learn this relationship require cleanly annotated subsets of data, as well as distillation or multi-faceted models for training. In this paper, we propose a new training algorithm that relies on a simple model to learn the relationship between clean and noisy labels without the need for a cleanly labelled subset of data. Our algorithm follows a 3-stage process, namely: 1) self-supervised pre-training followed by early-stopped training of the classifier to confidently predict clean labels for a subset of the training set; 2) use the clean set from stage (1) to bootstrap the relationship between images, noisy labels and clean labels, which we exploit for effective relabelling of the remaining training set using semi-supervised learning; and 3) supervised training of the classifier with all relabelled samples from stage (2). By learning this relationship, we achieve state-of-the-art performance in asymmetric and instance-dependent label noise problems.
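The 3-stage pipeline above can be sketched schematically. The sketch below is an illustrative assumption, not the paper's actual method: the confidence threshold `tau`, the `stage1_select_clean`/`stage2_relabel` names, and the fixed 0.7/0.3 blending of model predictions with the noisy labels are all hypothetical stand-ins for the self-supervised pre-training and semi-supervised relabelling the abstract describes.

```python
import numpy as np

def stage1_select_clean(probs, noisy_labels, tau=0.9):
    """Stage 1 (sketch): after pre-training and early-stopped training,
    keep only samples whose predicted label is confident (max prob >= tau)
    as the bootstrap 'clean' set."""
    pred = probs.argmax(axis=1)
    conf = probs.max(axis=1)
    clean_idx = np.where(conf >= tau)[0]
    return clean_idx, pred[clean_idx]

def stage2_relabel(probs, noisy_labels, clean_idx, clean_labels):
    """Stage 2 (sketch): relabel the remaining samples by blending the
    model's prediction with the original noisy label -- a crude proxy for
    the learned image/noisy-label/clean-label relationship."""
    n, c = probs.shape
    noisy_onehot = np.eye(c)[noisy_labels]
    blended = 0.7 * probs + 0.3 * noisy_onehot  # hypothetical weights
    relabels = blended.argmax(axis=1)
    relabels[clean_idx] = clean_labels  # trust the stage-1 clean labels
    return relabels

# Toy demo: 4 samples, 2 classes.
probs = np.array([[0.95, 0.05],
                  [0.60, 0.40],
                  [0.10, 0.90],
                  [0.55, 0.45]])
noisy = np.array([0, 1, 1, 1])
clean_idx, clean_lab = stage1_select_clean(probs, noisy, tau=0.9)
relabels = stage2_relabel(probs, noisy, clean_idx, clean_lab)
# Stage 3 would then train the classifier on (images, relabels) with
# ordinary supervised learning.
```

The design point is the order of operations: a small confidently-clean subset is extracted first, and only then is the clean/noisy relationship used to relabel everything else, so the final supervised stage never sees the raw noisy labels directly.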


