A Curriculum View of Robust Loss Functions

by Zebin Ou, et al.

Robust loss functions are designed to combat the adverse impact of label noise, and their robustness is typically supported by theoretical bounds that are agnostic to the training dynamics. However, these bounds can fail to characterize empirical performance, and it remains unclear why robust loss functions underfit. We show that most loss functions can be rewritten into a form with the same class-score margin but different sample-weighting functions. The resulting curriculum view provides a straightforward analysis of the training dynamics, which helps attribute underfitting to diminished average sample weights and noise robustness to larger weights for clean samples. We show that simple fixes to the curriculums can make underfitting robust loss functions competitive with the state-of-the-art, and that training schedules can substantially affect noise robustness even with robust loss functions. Code is available at <github>.
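The sample-weighting decomposition in the abstract can be illustrated with a well-known special case: under softmax, the gradient of MAE with respect to the logits equals the cross-entropy gradient scaled by 2·p_y, so MAE implicitly down-weights samples with low predicted probability on their labeled class (often the mislabeled ones). The following numpy sketch verifies this relation numerically; it is an illustration of the general idea, not the paper's exact derivation.

```python
import numpy as np

def softmax(z):
    # numerically stable softmax over a 1-D logit vector
    e = np.exp(z - z.max())
    return e / e.sum()

z = np.array([2.0, 0.5, -1.0])  # logits for a single sample
y = 0                           # labeled class
p = softmax(z)

onehot = np.zeros_like(z)
onehot[y] = 1.0

# d/dz of cross-entropy  -log p_y
grad_ce = p - onehot
# d/dz of MAE  sum_k |p_k - y_k| = 2 (1 - p_y): same direction, scaled by 2 p_y
grad_mae = 2.0 * p[y] * (p - onehot)

# the implicit per-sample weight of MAE relative to CE is 2 * p_y
ratio = np.linalg.norm(grad_mae) / np.linalg.norm(grad_ce)
print(ratio, 2.0 * p[y])
```

Because the shared factor (p - onehot) plays the role of the common class-score margin gradient, the scalar in front acts as the sample weight: constant for CE, but proportional to p_y for MAE, which starves hard (low-p_y) samples of gradient signal and is consistent with the underfitting behavior the abstract describes.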




