Boosting in the presence of label noise

09/26/2013
by Jakramate Bootkrajang, et al.

Boosting is known to be sensitive to label noise. We study two approaches to improving AdaBoost's robustness against labelling errors. One is to employ a label-noise robust classifier as the base learner; the other is to modify the AdaBoost algorithm itself to be more robust. Empirical evaluation shows that a committee of robust classifiers, although it converges faster than AdaBoost without label-noise awareness, is still susceptible to label noise. However, pairing it with the robust boosting algorithm we propose here results in an algorithm that is more resilient to mislabelling.
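To make the sensitivity concrete, the sketch below shows a plain AdaBoost training loop with one common, generic guard against label noise: capping how large any sample weight may grow. This is only an illustration of the kind of modification the abstract alludes to, not the algorithm proposed in the paper; the decision-stump base learner and the `cap` parameter are assumptions made for the example.

```python
# Illustrative sketch only: standard AdaBoost with a simple sample-weight cap,
# a generic heuristic for limiting the influence of possibly mislabelled points.
# This is NOT the method proposed in the paper.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def adaboost_with_weight_cap(X, y, n_rounds=50, cap=10.0):
    """Train an AdaBoost ensemble on labels y in {-1, +1}.

    `cap` (hypothetical parameter) limits how far any sample weight may rise
    above the uniform weight 1/n, so suspected-noisy examples cannot dominate
    later rounds.
    """
    n = len(y)
    w = np.full(n, 1.0 / n)          # sample weights, initially uniform
    stumps, alphas = [], []

    for _ in range(n_rounds):
        stump = DecisionTreeClassifier(max_depth=1)
        stump.fit(X, y, sample_weight=w)
        pred = stump.predict(X)

        # Weighted training error of this round's weak learner
        err = np.sum(w * (pred != y)) / np.sum(w)
        err = np.clip(err, 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)

        # Standard exponential reweighting: misclassified points gain weight
        w *= np.exp(-alpha * y * pred)
        # Illustrative robustness tweak: cap weights before renormalising
        w = np.minimum(w, cap / n)
        w /= w.sum()

        stumps.append(stump)
        alphas.append(alpha)

    def predict(X_new):
        scores = sum(a * s.predict(X_new) for a, s in zip(alphas, stumps))
        return np.sign(scores)

    return predict
```

Without the cap, the exponential reweighting concentrates ever more mass on points the ensemble keeps getting wrong, which is exactly the mechanism that makes vanilla AdaBoost fit mislabelled examples.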


