
A Flexible Class of Dependence-aware Multi-Label Loss Functions

by Eyke Hüllermeier et al.

Multi-label classification is the task of assigning a subset of labels to a given query instance. For evaluating such predictions, the set of predicted labels needs to be compared to the ground-truth label set associated with that instance, and various loss functions have been proposed for this purpose. In addition to assessing predictive accuracy, a key concern in this regard is to foster and to analyze a learner's ability to capture label dependencies. In this paper, we introduce a new class of loss functions for multi-label classification, which overcome disadvantages of commonly used losses such as Hamming and subset 0/1. To this end, we leverage the mathematical framework of non-additive measures and integrals. Roughly speaking, a non-additive measure allows for modeling, in a flexible way, the importance of correct predictions of label subsets (instead of single labels), and thereby their impact on the overall evaluation. Hamming and subset 0/1 are rather extreme in this regard, giving full importance to single labels and to the entire label set, respectively. We present concrete instantiations of this class, which comprise Hamming and subset 0/1 as special cases, and which appear to be especially appealing from a modeling perspective. The assessment of multi-label classifiers in terms of these losses is illustrated in an empirical study.
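The idea of grading losses between the Hamming and subset 0/1 extremes can be sketched with a small example. Here the loss is computed by applying a set function mu to the set of incorrectly predicted labels; the specific measures below (an additive one, an "all-or-nothing" one, and a power-transformed one with a hypothetical parameter beta) are illustrative choices, not the paper's exact instantiations.

```python
def subset_loss(y_true, y_pred, mu):
    """Evaluate a prediction via a (possibly non-additive) measure mu
    applied to the set of mislabeled positions."""
    m = len(y_true)
    # Indices where prediction and ground truth disagree
    mistakes = frozenset(i for i, (t, p) in enumerate(zip(y_true, y_pred)) if t != p)
    return mu(mistakes, m)

# Hamming loss: an additive measure, each wrong label counts equally
def hamming(A, m):
    return len(A) / m

# Subset 0/1 loss: any mistake at all receives full weight
def subset01(A, m):
    return 1.0 if A else 0.0

# An intermediate, symmetric non-additive measure (hypothetical choice):
# mu(A) = (|A|/m)**beta with 0 < beta <= 1.  beta = 1 recovers Hamming,
# while beta -> 0 pushes the loss toward subset 0/1 behavior.
def power_measure(A, m, beta=0.5):
    return (len(A) / m) ** beta if A else 0.0
```

For `y_true = [1, 0, 1, 0]` and `y_pred = [1, 1, 1, 0]` there is one mistake out of four labels, so the Hamming loss is 0.25, subset 0/1 is 1.0, and the power measure with beta = 0.5 gives 0.5, sitting between the two extremes.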




Learning Gradient Boosted Multi-label Classification Rules

In multi-label classification, where the evaluation of predictions is le...

Reliable Multi-label Classification: Prediction with Partial Abstention

In contrast to conventional (single-label) classification, the setting o...

A Cross-Conformal Predictor for Multi-label Classification

Unlike the typical classification setting where each instance is associa...

Multi-label classification: do Hamming loss and subset accuracy really conflict with each other?

Various evaluation measures have been developed for multi-label classifi...

CCMN: A General Framework for Learning with Class-Conditional Multi-Label Noise

Class-conditional noise commonly exists in machine learning tasks, where...

Unbiased Loss Functions for Extreme Classification With Missing Labels

The goal in extreme multi-label classification (XMC) is to tag an instan...

On the benefits of output sparsity for multi-label classification

The multi-label classification framework, where each observation can be ...