Learning from Multiple Corrupted Sources, with Application to Learning from Label Proportions

10/10/2019
by Clayton Scott, et al.

We study binary classification in the setting where the learner is presented with multiple corrupted training samples, with possibly different sample sizes and degrees of corruption, and introduce an approach based on minimizing a weighted combination of corruption-corrected empirical risks. We establish a generalization error bound, and further show that the bound is optimized when the weights are certain interpretable and intuitive functions of the sample sizes and degrees of corruption. We then apply this framework to the problem of learning from label proportions (LLP), and propose an algorithm that enjoys the most general statistical performance guarantees known for LLP. Experiments demonstrate the utility of our theory.
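As a reading aid, the approach described above can be sketched as a weighted, corruption-corrected empirical risk minimization. The notation below (number of sources M, weights w_i, corrected risks \widehat{R}_i, hypothesis class \mathcal{F}) is ours for illustration and is not taken from the paper:

% Illustrative sketch only; symbols are assumed, not the paper's notation.
\[
\hat{f} \;\in\; \operatorname*{arg\,min}_{f \in \mathcal{F}} \;\sum_{i=1}^{M} w_i \, \widehat{R}_i(f),
\qquad w_i \ge 0, \quad \sum_{i=1}^{M} w_i = 1,
\]

where \widehat{R}_i denotes the corruption-corrected empirical risk computed from the i-th corrupted sample. Per the abstract, the generalization bound is minimized when the weights w_i are chosen as interpretable functions of the sample sizes and the degrees of corruption.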
