A Second-Order Approach to Learning with Instance-Dependent Label Noise

12/22/2020
by   Zhaowei Zhu, et al.

The presence of label noise often misleads the training of deep neural networks. Unlike the recent literature, which largely assumes that the label noise rate is determined only by the true class, errors in human-annotated labels are more likely to depend on the difficulty of each example, resulting in settings with instance-dependent label noise. We show theoretically that heterogeneous, instance-dependent label noise effectively down-weights examples with higher noise rates in a non-uniform way and thus induces imbalances, making the strategy of directly applying methods designed for class-dependent label noise questionable. In this paper, we propose and study the potential of a second-order approach that leverages estimates of several covariance terms defined between the instance-dependent noise rates and the Bayes optimal label. We show that this set of second-order information successfully captures the induced imbalances. We further show that, with the help of the estimated second-order information, the expected risk of a classifier under instance-dependent label noise can be rewritten as an equivalent risk under only class-dependent label noise. This equivalence allows us to develop effective loss functions that correctly evaluate models. We provide an efficient procedure to perform these estimations without access to either ground-truth labels or prior knowledge of the noise rates. Experiments on CIFAR10 and CIFAR100 with synthetic instance-dependent label noise and on Clothing1M with real-world human label noise verify our approach.
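To make the high-level idea concrete, here is a minimal PyTorch sketch of a loss that subtracts an estimated per-example second-order correction from the cross-entropy on noisy labels. The function name `covariance_corrected_loss` and the tensor `cov_term` are placeholders introduced for illustration; the actual estimator of the covariance terms is the one described in the paper and is not reproduced here.

```python
import torch
import torch.nn.functional as F

def covariance_corrected_loss(logits, noisy_labels, cov_term):
    """Hypothetical sketch of a second-order-corrected loss.

    logits:       (N, C) model outputs
    noisy_labels: (N,)   observed, possibly corrupted labels
    cov_term:     (N,)   per-example correction estimated offline from
                         second-order statistics (covariance between the
                         instance-dependent noise rates and the Bayes
                         optimal label); the estimator itself is assumed
                         to follow the paper's procedure.
    """
    per_example_ce = F.cross_entropy(logits, noisy_labels, reduction="none")
    # Subtracting the estimated covariance term re-balances the implicit,
    # non-uniform down-weighting caused by instance-dependent noise.
    return (per_example_ce - cov_term).mean()

# Toy usage with random placeholder tensors (not real data):
logits = torch.randn(8, 10)
noisy_labels = torch.randint(0, 10, (8,))
cov_term = torch.zeros(8)  # with a zero correction this reduces to plain CE
loss = covariance_corrected_loss(logits, noisy_labels, cov_term)
```

With the correction set to zero, the sketch reduces to ordinary cross-entropy on the noisy labels; the second-order term is what distinguishes the corrected evaluation from a naive one.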
