Fast classification rates without standard margin assumptions

10/28/2019
by Olivier Bousquet et al.

We consider the classical problem of learning rates for classes with finite VC dimension. It is well known that fast learning rates are achievable by the empirical risk minimization algorithm (ERM) if one of the low-noise/margin assumptions, such as Tsybakov's or Massart's condition, is satisfied. In this paper, we consider an alternative way of obtaining fast learning rates in classification when none of these conditions is met. We first consider Chow's reject option model and show that, by lowering the impact of a small fraction of hard instances, a fast learning rate is achievable in an agnostic model by a specific learning algorithm. Similar results were previously known only under special versions of the margin assumptions. We also show that the learning algorithm achieving these rates is adaptive to standard margin assumptions and always satisfies the risk bounds achieved by ERM. Based on our results on Chow's model, we then analyze a particular family of VC classes, namely classes with finite combinatorial diameter. Using their special structure, we show that there is an improper learning algorithm that provides fast rates of convergence even in the (poorly understood) situations where ERM is suboptimal. This provides the first setup in which an improper learning algorithm may significantly improve the learning rates for non-convex losses. Finally, we discuss some implications of our techniques for the analysis of ERM.
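
As background for the abstract above, here is a brief, informal sketch (our notation, not necessarily the paper's) of the two ingredients it refers to: Chow's reject-option loss and the standard margin conditions. Writing $\eta(x) = \Pr[Y = 1 \mid X = x]$ for the regression function, a classifier with a reject option is a map $h : \mathcal{X} \to \{0, 1, \ast\}$, where $\ast$ denotes abstention, and Chow's loss with rejection cost $0 < c < 1/2$ is

$$
\ell_c(h(x), y) =
\begin{cases}
0 & \text{if } h(x) = y, \\
1 & \text{if } h(x) \in \{0, 1\},\ h(x) \neq y, \\
c & \text{if } h(x) = \ast.
\end{cases}
$$

Massart's condition with margin parameter $h_0 > 0$ requires $|\eta(X) - 1/2| \ge h_0/2$ almost surely, while one common form of Tsybakov's condition requires $\Pr\big(|\eta(X) - 1/2| \le t\big) \le B\, t^{\beta}$ for all $t > 0$ and some constants $B, \beta > 0$. The point of the abstract is that fast rates are obtained here without imposing either condition.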
