Weston-Watkins Hinge Loss and Ordered Partitions

06/12/2020
by Yutong Wang, et al.

Multiclass extensions of the support vector machine (SVM) have been formulated in a variety of ways. A recent empirical comparison of nine such formulations [Doǧan et al. 2016] recommends the variant proposed by Weston and Watkins (WW), despite the fact that the WW-hinge loss is not calibrated with respect to the 0-1 loss. In this work we introduce a novel discrete loss function for multiclass classification, the ordered partition loss, and prove that the WW-hinge loss is calibrated with respect to this loss. We also argue that the ordered partition loss is maximally informative among discrete losses satisfying this property. Finally, we apply our theory to justify the empirical observation made by Doǧan et al. that the WW-SVM can work well even under massive label noise, a challenging setting for multiclass SVMs.
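For concreteness, the Weston-Watkins (WW) hinge loss on a score vector v with true label y is the sum over incorrect classes j of max(0, 1 - (v_y - v_j)), penalizing every class whose score comes within margin 1 of the true class. Below is a minimal NumPy sketch of this standard definition (illustrative only, not the authors' code; the function name is ours):

```python
import numpy as np

def ww_hinge_loss(scores, y):
    """Weston-Watkins hinge loss for a single example.

    scores: 1-D array of class scores (length k); y: index of the true class.
    Returns sum_{j != y} max(0, 1 - (scores[y] - scores[j])).
    """
    # Margins of the true class over every incorrect class
    margins = scores[y] - np.delete(scores, y)
    # Hinge penalty for each incorrect class, summed
    return np.maximum(0.0, 1.0 - margins).sum()

# Well-separated true class: every margin exceeds 1, so zero loss
print(ww_hinge_loss(np.array([3.0, 0.5, -1.0]), 0))  # 0.0
# All scores tied: each of the k-1 incorrect classes contributes 1
print(ww_hinge_loss(np.array([1.0, 1.0, 1.0]), 0))  # 2.0
```

Because the loss decomposes over incorrect classes, each wrong label contributes its own hinge term, which is the structural feature the paper's calibration analysis relates to ordered partitions.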


