Learning from Data with Noisy Labels Using Temporal Self-Ensemble

07/21/2022
by Jun Ho Lee, et al.

Real-world datasets inevitably contain many mislabeled samples. Because deep neural networks (DNNs) have an enormous capacity to memorize noisy labels, a robust training scheme is required to prevent labeling errors from degrading the generalization performance of DNNs. Current state-of-the-art methods present a co-training scheme that trains dual networks using samples associated with small losses. In practice, however, training two networks simultaneously can burden computing resources. In this study, we propose a simple yet effective robust training scheme that operates by training only a single network. During training, the proposed method generates temporal self-ensembles by sampling intermediate network parameters from the weight trajectory formed by stochastic gradient descent optimization. The loss sum evaluated with these self-ensembles is used to identify incorrectly labeled samples. In parallel, our method generates multi-view predictions by transforming the input data into various forms and considers their agreement to identify incorrectly labeled samples. By combining the aforementioned metrics, we present the proposed self-ensemble-based robust training (SRT) method, which can filter samples with noisy labels to reduce their influence on training. Experiments on widely used public datasets demonstrate that the proposed method achieves state-of-the-art performance in some categories without training dual networks.
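To make the idea concrete, the following is a minimal sketch of the mechanism described in the abstract: intermediate weights sampled along the SGD trajectory form a temporal self-ensemble, each sample is scored by its loss summed over the snapshots and by the agreement of predictions across augmented views, and low-scoring samples are filtered out. All names here (TemporalSelfEnsemble, snapshot_every, select_clean, keep_ratio, agree_thresh) are illustrative assumptions for a PyTorch-style classifier, not the authors' reference implementation.

```python
# Sketch of noisy-label filtering with a temporal self-ensemble (assumptions noted above).
import copy
import torch
import torch.nn.functional as F


class TemporalSelfEnsemble:
    """Stores intermediate network parameters sampled from the SGD weight trajectory."""

    def __init__(self, max_snapshots=5):
        self.snapshots = []
        self.max_snapshots = max_snapshots

    def update(self, model):
        # Called periodically (e.g. every `snapshot_every` steps) to record a snapshot.
        self.snapshots.append(copy.deepcopy(model).eval())
        if len(self.snapshots) > self.max_snapshots:
            self.snapshots.pop(0)

    @torch.no_grad()
    def ensemble_loss(self, x, y):
        """Per-sample cross-entropy loss summed over all stored snapshots."""
        total = torch.zeros(x.size(0), device=x.device)
        for net in self.snapshots:
            total += F.cross_entropy(net(x), y, reduction="none")
        return total


@torch.no_grad()
def view_agreement(model, views):
    """Fraction of augmented views whose argmax prediction matches the first view."""
    preds = [model(v).argmax(dim=1) for v in views]
    ref = preds[0]
    agree = torch.stack([(p == ref).float() for p in preds], dim=0)
    return agree.mean(dim=0)


def select_clean(ens_loss, agreement, keep_ratio=0.7, agree_thresh=0.5):
    """Keep small-loss samples whose multi-view predictions agree (likely clean labels)."""
    k = max(1, int(keep_ratio * ens_loss.numel()))
    small_loss = torch.zeros_like(ens_loss, dtype=torch.bool)
    small_loss[ens_loss.topk(k, largest=False).indices] = True
    return small_loss & (agreement >= agree_thresh)


# Usage inside a training step (sketch):
#   mask = select_clean(ensemble.ensemble_loss(x, y),
#                       view_agreement(model, [x, augment(x)]))
#   per_sample = F.cross_entropy(model(x), y, reduction="none")
#   loss = (per_sample * mask).sum() / mask.sum().clamp(min=1)
#   ...and every few steps: ensemble.update(model)
```

Because the ensemble reuses checkpoints of the single network being trained, this selection requires only one set of trainable parameters, in contrast to co-training schemes that maintain two networks.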
