Tripartite: Tackle Noisy Labels by a More Precise Partition

by Xuefeng Liang, et al.

Samples in large-scale datasets may be mislabeled for various reasons, and Deep Neural Networks can easily overfit to noisy-label data. The key to tackling this problem is to alleviate the harm of these noisy labels. Many existing methods divide the training data into clean and noisy subsets according to loss values, and then process the noisy-label data differently. One reason these methods fall short is hard samples: because hard samples always have relatively large losses whether their labels are clean or noisy, such methods cannot divide them precisely. Instead, we propose a Tripartite solution that partitions the training data more precisely into three subsets: hard, noisy, and clean. The partition criteria are based on the inconsistency between the predictions of two networks, and the inconsistency between a network's prediction and the given label. To minimize the harm of noisy labels while maximizing the value of noisy-label data, we apply low-weight learning to hard data and self-supervised learning to noisy-label data without using the given labels. Extensive experiments demonstrate that Tripartite filters out noisy-label data more precisely and outperforms most state-of-the-art methods on five benchmark datasets, especially real-world datasets.
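The partition criteria described above can be sketched in code. This is a minimal illustration, not the authors' exact implementation: it assumes a sample is "hard" when the two networks disagree with each other, "noisy" when they agree with each other but contradict the given label, and "clean" when both agree with the label. The function name and rule details are assumptions for illustration.

```python
import numpy as np

def tripartite_partition(preds_a, preds_b, labels):
    """Split samples into clean / noisy / hard boolean masks.

    Assumed rule (a sketch of the Tripartite criteria):
    - hard:  the two networks' predicted classes disagree
    - noisy: the networks agree with each other but not with the label
    - clean: both networks' predictions match the given label
    """
    preds_a = np.asarray(preds_a)
    preds_b = np.asarray(preds_b)
    labels = np.asarray(labels)

    agree = preds_a == preds_b          # inter-network consistency
    match = preds_a == labels           # prediction vs. given label

    clean = agree & match
    noisy = agree & ~match
    hard = ~agree
    return clean, noisy, hard
```

For example, with predictions `[0, 1, 2]` and `[0, 1, 3]` against labels `[0, 2, 2]`, the first sample is clean (both networks agree with the label), the second is noisy (the networks agree but contradict the label), and the third is hard (the networks disagree). The hard subset would then receive low-weight supervised learning, and the noisy subset self-supervised learning without its given labels.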




