A Gradient-based Approach for Online Robust Deep Neural Network Training with Noisy Labels

06/08/2023
by Yifan Yang, et al.

Learning with noisy labels is an important topic for scalable training in many real-world scenarios. However, little previous research considers this problem in the online setting, where data arrive as a stream. In this paper, we propose a novel gradient-based approach that detects noisy labels during the online learning of model parameters, named Online Gradient-based Robust Selection (OGRS). In contrast to previous sample-selection approaches for offline training, which require estimating the clean ratio of the dataset before each training epoch, OGRS automatically selects clean samples through gradient-update steps and handles datasets with varying clean ratios without changing its parameter settings. During training, OGRS selects clean samples at each iteration and feeds the selected samples into an incremental update of the model parameters. We provide a detailed theoretical analysis showing that the sample selection process converges to the low-loss region of the sample space, by introducing and proving a sub-linear local Lagrangian regret bound for the underlying non-convex constrained optimization problem. Experimental results show that OGRS outperforms state-of-the-art methods in different settings.
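The core loop the abstract describes — per-iteration selection of likely-clean samples followed by an incremental model update on only those samples — can be sketched as follows. This is a minimal illustration, not the authors' exact OGRS algorithm: the function names, the simple low-loss selection rule (standing in for the paper's gradient-based search over the sample space), and the linear model are all assumptions made for clarity.

```python
import numpy as np

def per_sample_loss(w, X, y):
    """Squared loss of a linear model, computed per sample."""
    return (X @ w - y) ** 2

def ogrs_style_update(w, X, y, lr=0.01, k=4):
    """One online iteration (hypothetical sketch): keep the k
    lowest-loss ("likely clean") samples of the current mini-batch,
    then take a single gradient step on that subset only."""
    losses = per_sample_loss(w, X, y)
    clean_idx = np.argsort(losses)[:k]        # low-loss region of the batch
    Xc, yc = X[clean_idx], y[clean_idx]
    grad = 2 * Xc.T @ (Xc @ w - yc) / k       # gradient on selected samples
    return w - lr * grad

# Toy stream: 32 samples, the first 8 with corrupted (noisy) labels.
rng = np.random.default_rng(0)
X = rng.normal(size=(32, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true
y[:8] += rng.normal(scale=5.0, size=8)        # inject label noise

w = np.zeros(3)
for _ in range(500):
    batch = rng.choice(32, size=16, replace=False)
    w = ogrs_style_update(w, X[batch], y[batch])
```

Because the model is updated only on the selected low-loss subset, the noisy samples exert progressively less influence as the parameters improve, which mirrors the online selection-then-update structure described in the abstract.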


