Maximising the Utility of Validation Sets for Imbalanced Noisy-label Meta-learning

08/17/2022
by Dung Anh Hoang, et al.

Meta-learning is an effective method for handling imbalanced and noisy-label learning, but it depends on a validation set of samples that are randomly selected, manually labelled, and class-balanced. Randomly selecting, manually labelling, and balancing this validation set is not only sub-optimal for meta-learning, but it also scales poorly with the number of classes. Recent meta-learning papers have therefore proposed ad-hoc heuristics to automatically build and label this validation set, but these heuristics remain sub-optimal for meta-learning. In this paper, we analyse the meta-learning algorithm and propose new criteria to characterise the utility of the validation set, based on: 1) the informativeness of the validation set; 2) the balance of its class distribution; and 3) the correctness of its labels. Furthermore, we propose a new imbalanced noisy-label meta-learning (INOLML) algorithm that automatically builds a validation set by maximising its utility under the criteria above. Our method shows significant improvements over previous meta-learning approaches and sets a new state of the art on several benchmarks.
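To make the three utility criteria concrete, the sketch below shows one plausible way to score candidate samples and greedily build a validation set. The scoring functions (entropy for informativeness, model confidence in the given label for correctness, and negative variance of class counts for balance) and the greedy selection loop are illustrative assumptions, not the INOLML objective from the paper:

```python
import numpy as np

def utility(candidate, selected, probs, labels, num_classes):
    """Toy utility combining the three criteria named in the abstract.

    - informativeness: entropy of the model's predictive distribution
    - correctness: probability the model assigns to the sample's label
    - balance: negative variance of class counts after adding the candidate

    This weighting is a hypothetical illustration, not the paper's method.
    """
    p = probs[candidate]
    informativeness = -np.sum(p * np.log(p + 1e-12))
    correctness = p[labels[candidate]]
    counts = np.bincount(
        [labels[i] for i in selected] + [labels[candidate]],
        minlength=num_classes,
    )
    balance = -np.var(counts)
    return informativeness + correctness + balance

def greedy_select(probs, labels, num_classes, budget):
    """Greedily grow the validation set by adding the highest-utility sample."""
    selected, remaining = [], set(range(len(labels)))
    for _ in range(budget):
        best = max(remaining,
                   key=lambda i: utility(i, selected, probs, labels, num_classes))
        selected.append(best)
        remaining.remove(best)
    return selected
```

With uniform predictions, the balance term dominates and the greedy loop naturally picks one sample per class, mirroring criterion 2; in practice the informativeness and correctness terms would be computed from a trained model's outputs.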


