APAM: Adaptive Pre-training and Adaptive Meta Learning in Language Model for Noisy Labels and Long-tailed Learning

02/06/2023
by Sunyi Chi, et al.

Practical natural language processing (NLP) tasks commonly involve long-tailed label distributions and noisy labels. These problems challenge the generalization and robustness of complex models such as Deep Neural Networks (DNNs). Common resampling techniques, such as oversampling or undersampling, can easily lead to overfitting. Learning data weights with the help of a small amount of metadata has grown increasingly popular. In addition, recent studies have shown the advantages of self-supervised pre-training, particularly for under-represented data. In this work, we propose a general framework that handles both long-tailed distributions and noisy labels. The model is adapted to the problem domain through contrastive pre-training. The re-weighting module is a feed-forward network that learns an explicit weighting function and adapts the weights according to metadata. The framework further adapts the weights of the terms in the loss function by combining the polynomial expansion of the cross-entropy loss with the focal loss. Our extensive experiments show that the proposed framework consistently outperforms baseline methods. Finally, our sensitivity analysis highlights the framework's ability to handle the long-tailed problem and to mitigate the negative impact of noisy labels.
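To make the two loss-side components concrete, below is a minimal sketch (not the authors' released code) of what the abstract describes: a feed-forward weighting network that maps a per-example loss value to a sample weight, in the style of Meta-Weight-Net, and a focal loss augmented with the leading term of the polynomial expansion of cross-entropy, in the style of PolyLoss. The names and hyperparameters (WeightNet, hidden_dim, epsilon, gamma) are illustrative assumptions, and the bilevel meta-update on the metadata set is omitted.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class WeightNet(nn.Module):
        """Feed-forward net mapping a per-example loss to a weight in (0, 1)."""
        def __init__(self, hidden_dim: int = 100):
            super().__init__()
            self.net = nn.Sequential(
                nn.Linear(1, hidden_dim),
                nn.ReLU(),
                nn.Linear(hidden_dim, 1),
                nn.Sigmoid(),
            )

        def forward(self, per_example_loss: torch.Tensor) -> torch.Tensor:
            # per_example_loss: (batch,) -> weights: (batch,)
            return self.net(per_example_loss.unsqueeze(-1)).squeeze(-1)

    def poly_focal_loss(logits, targets, epsilon=1.0, gamma=2.0):
        """Focal loss plus the leading polynomial term epsilon * (1 - p_t)^(gamma + 1)."""
        ce = F.cross_entropy(logits, targets, reduction="none")  # -log p_t, shape (batch,)
        pt = torch.exp(-ce)                                      # p_t
        focal = (1.0 - pt) ** gamma * ce
        poly1 = epsilon * (1.0 - pt) ** (gamma + 1.0)
        return focal + poly1                                     # per-example, shape (batch,)

    # Usage: weight the per-example losses with the meta-learned weights.
    logits = torch.randn(8, 5)
    targets = torch.randint(0, 5, (8,))
    weight_net = WeightNet()
    loss_vec = poly_focal_loss(logits, targets)
    weighted_loss = (weight_net(loss_vec.detach()) * loss_vec).mean()

In the full meta-learning loop, the weighting network's parameters would additionally be updated by descending the loss on the small metadata set after a virtual update of the classifier; that outer step is left out here for brevity.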

Related research

08/26/2021  Robust Long-Tailed Learning under Label Noise
Long-tailed learning has attracted much attention recently, with the goa...

11/20/2022  Learning from Long-Tailed Noisy Data with Sample Selection and Balanced Loss
The success of deep learning depends on large-scale and well-curated tra...

08/25/2021  Learning From Long-Tailed Data With Noisy Labels
Class imbalance and noisy labels are the norm rather than the exception ...

12/30/2021  Delving into Sample Loss Curve to Embrace Noisy and Imbalanced Data
Corrupted labels and class imbalance are commonly encountered in practic...

12/24/2021  Is Importance Weighting Incompatible with Interpolating Classifiers?
Importance weighting is a classic technique to handle distribution shift...

11/22/2022  Dynamic Loss For Robust Learning
Label noise and class imbalance commonly coexist in real-world data. Pre...
