Multi-class Label Noise Learning via Loss Decomposition and Centroid Estimation

03/21/2022
by   Yongliang Ding, et al.

In real-world scenarios, large-scale datasets often contain inaccurate labels, i.e., noisy labels, which can confuse model training and degrade performance. To overcome this issue, Label Noise Learning (LNL) has recently attracted much attention, and various methods have been proposed to construct an unbiased estimator of the noise-free risk in order to combat such label noise. Among them, a line of work based on Loss Decomposition and Centroid Estimation (LDCE) has shown very promising performance. However, existing LDCE-based LNL methods are designed only for binary classification and do not extend directly to the multi-class setting. In this paper, we propose a novel multi-class robust learning method based on LDCE, termed "MC-LDCE". Specifically, we decompose a commonly adopted loss function (e.g., the mean squared loss) into a label-dependent part and a label-independent part, of which only the former is influenced by label noise. Further, by defining a new form of data centroid, we transform the problem of recovering the label-dependent part into a centroid estimation problem. Finally, by analyzing the mathematical expectation of the clean data centroid given the observed noisy set, the centroid can be estimated, which in turn yields an unbiased risk estimator for multi-class learning. The proposed MC-LDCE method is general and applicable to different types of classification models (i.e., linear and nonlinear). Experimental results on five public datasets demonstrate the superiority of MC-LDCE over other representative LNL methods in tackling the multi-class label noise problem.
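The loss-decomposition idea behind LDCE can be sketched in a simplified binary setting (the paper's contribution is the multi-class extension, which is not reproduced here). Assuming symmetric label flips with known rate rho, the noisy labels satisfy E[y_noisy] = (1 - 2*rho)*y, so the clean label-weighted centroid can be estimated unbiasedly from noisy data by rescaling; the squared loss then splits into a label-independent quadratic term and a cross term that depends on data only through this centroid. All variable names and the synthetic data below are illustrative assumptions, not the authors' implementation:

```python
# Hedged sketch of loss decomposition + centroid estimation under symmetric
# label noise, for a linear model with binary labels y in {-1, +1}.
import random

random.seed(0)

n, d = 20000, 3
rho = 0.3  # symmetric flip rate (assumed known, rho < 0.5)

# Synthetic clean data: x uniform in [-1, 1]^d, y = sign of first coordinate.
xs = [[random.uniform(-1, 1) for _ in range(d)] for _ in range(n)]
ys = [1 if x[0] >= 0 else -1 for x in xs]

# Observed noisy labels: each label flipped independently with probability rho.
ys_noisy = [-y if random.random() < rho else y for y in ys]

def centroid(xs, ys):
    """Label-weighted data centroid c = (1/n) * sum_i y_i * x_i."""
    m = len(xs)
    return [sum(y * x[j] for x, y in zip(xs, ys)) / m for j in range(len(xs[0]))]

c_clean = centroid(xs, ys)
c_noisy = centroid(xs, ys_noisy)

# Since E[y_noisy] = (1 - 2*rho) * y, rescaling the noisy centroid gives an
# unbiased estimate of the clean centroid from noisy labels alone.
c_hat = [v / (1 - 2 * rho) for v in c_noisy]

# Decomposition: (w.x - y)^2 = (w.x)^2 + y^2 - 2*y*(w.x). Averaged over the
# data, the label-dependent cross term equals -2 * w.c, so plugging in the
# corrected centroid yields an (approximately) unbiased risk estimate.
def risk_via_centroid(w, xs, c):
    quad = sum(sum(wj * xj for wj, xj in zip(w, x)) ** 2 for x in xs) / len(xs)
    cross = -2 * sum(wj * cj for wj, cj in zip(w, c))
    return quad + 1.0 + cross  # y^2 == 1 for labels in {-1, +1}

w = [1.0, 0.0, 0.0]
print(risk_via_centroid(w, xs, c_hat))    # close to the clean-label risk
print(risk_via_centroid(w, xs, c_clean))  # clean-label risk
```

With enough samples, the corrected centroid `c_hat` lands much closer to `c_clean` than the raw noisy centroid does, which is the mechanism that makes the risk estimator unbiased despite never seeing clean labels.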

Related research

02/16/2020
Multi-Class Classification from Noisy-Similarity-Labeled Data
A similarity label indicates whether two instances belong to the same cl...

02/01/2023
Learning from Stochastic Labels
Annotating multi-class instances is a crucial task in the field of machi...

11/24/2022
Lifting Weak Supervision To Structured Prediction
Weak supervision (WS) is a rich set of techniques that produce pseudolab...

05/09/2023
FedNoRo: Towards Noise-Robust Federated Learning by Addressing Class Imbalance and Label Noise Heterogeneity
Federated noisy label learning (FNLL) is emerging as a promising tool fo...

08/04/2021
Multi-Label Gold Asymmetric Loss Correction with Single-Label Regulators
Multi-label learning is an emerging extension of the multi-class classif...

02/25/2023
Complementary to Multiple Labels: A Correlation-Aware Correction Approach
Complementary label learning (CLL) requires annotators to give irrelevan...

02/03/2014
Transductive Learning with Multi-class Volume Approximation
Given a hypothesis space, the large volume principle by Vladimir Vapnik ...
