Learning with Biased Complementary Labels

11/27/2017
by Xiyu Yu, et al.

In this paper, we study a classification problem in which we have access to an easily obtainable surrogate for the true labels, namely complementary labels, which specify classes that observations do not belong to. For example, if one is familiar with monkeys but not meerkats, a meerkat is easily identified as not a monkey, so "monkey" can be assigned to the meerkat as a complementary label. Specifically, let Y and Y̅ be the true and complementary labels, respectively. We first model the annotation of complementary labels via the transition probabilities P(Y̅=i|Y=j), i ≠ j ∈ {1,...,c}, where c is the number of classes. Previous methods implicitly assume that the transition probabilities P(Y̅=i|Y=j) are identical, which is far from true in practice because humans are biased toward their own experience. For example, a person who is more familiar with monkeys than with prairie dogs is more likely to employ "monkey" as a complementary label when annotating meerkats. We therefore argue that the transition probabilities will, in general, differ. In this paper, we address three fundamental problems raised by learning with biased complementary labels. (1) How can the transition probabilities be estimated? (2) How can traditional loss functions be modified, and standard deep neural network classifiers be extended, to learn with biased complementary labels? (3) Does the classifier learned by our proposed method from examples with complementary labels converge to the optimal classifier learned from examples with true labels? Comprehensive experiments on MNIST, CIFAR10, CIFAR100, and Tiny ImageNet empirically validate that the proposed method outperforms current state-of-the-art methods, with accuracy gains of over 10%.
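To make the loss modification in question (2) concrete, below is a minimal sketch of one plausible forward-correction scheme, assuming a known or estimated transition matrix Q with Q[j, i] = P(Y̅=i|Y=j): the network's class posterior p(Y|x) is mapped to a predicted distribution over complementary labels, q(Y̅|x) = Qᵀ p(Y|x), and cross-entropy is applied against the observed complementary label. The abstract does not spell out the paper's exact formulation, so the function names and the PyTorch setup here are illustrative assumptions, not the authors' reference implementation.

```python
import torch
import torch.nn.functional as F

def forward_corrected_loss(logits, comp_labels, Q):
    """Cross-entropy on complementary labels after a forward correction.

    Hypothetical sketch: given class posteriors p(Y|x) from the network,
    the predicted distribution over complementary labels is
    q(Ybar|x) = Q^T p(Y|x), where Q[j, i] = P(Ybar=i | Y=j).

    logits:      (batch, c) raw network outputs.
    comp_labels: (batch,)   observed complementary labels Ybar.
    Q:           (c, c)     transition matrix, rows sum to 1, zero diagonal.
    """
    p = F.softmax(logits, dim=1)   # p(Y|x), shape (batch, c)
    q = p @ Q                      # q[b, i] = sum_j p[b, j] * Q[j, i]
    q = q.clamp_min(1e-12)         # numerical safety before taking the log
    return F.nll_loss(q.log(), comp_labels)

def estimate_transition_matrix(true_labels, comp_labels, c):
    """Estimate Q by counting, assuming (hypothetically) that a small
    subset of data carries both true and complementary labels."""
    counts = torch.zeros(c, c)
    for y, ybar in zip(true_labels, comp_labels):
        counts[y, ybar] += 1.0
    # Row-normalize to get P(Ybar=i | Y=j); diagonal stays zero because
    # a complementary label never equals the true label.
    return counts / counts.sum(dim=1, keepdim=True).clamp_min(1.0)
```

At training time, the network would minimize this loss using only complementary labels; at test time, the learned p(Y|x) is used directly for ordinary classification. This is a sketch under the stated assumptions, not a claim about the paper's exact algorithm.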


Related research

05/15/2023 · CLCIFAR: CIFAR-Derived Benchmark Datasets with Human Annotated Complementary Labels
As a weakly-supervised learning paradigm, complementary label learning (...

02/06/2020 · Bridging Ordinary-Label Learning and Complementary-Label Learning
Unlike ordinary supervised pattern recognition, in a newly proposed fram...

05/22/2017 · Learning from Complementary Labels
Collecting labeled data is costly and thus a critical bottleneck in real...

06/01/2019 · Are Anchor Points Really Indispensable in Label-Noise Learning?
In label-noise learning, noise transition matrix, denoting the probabili...

05/15/2023 · Enhancing Label Sharing Efficiency in Complementary-Label Learning with Label Augmentation
Complementary-label Learning (CLL) is a form of weakly supervised learni...

11/19/2022 · Complementary Labels Learning with Augmented Classes
Complementary Labels Learning (CLL) arises in many real-world tasks such...

09/20/2022 · Reduction from Complementary-Label Learning to Probability Estimates
Complementary-Label Learning (CLL) is a weakly-supervised learning probl...
