Multiclass learning with margin: exponential rates with no bias-variance trade-off

by Stefano Vigogna et al.

We study the behavior of error bounds for multiclass classification under suitable margin conditions. For a wide variety of methods we prove that, under a hard-margin condition, the classification error decreases exponentially fast without any bias-variance trade-off. Different convergence rates are obtained under different margin assumptions. With a self-contained and instructive analysis we generalize known results from the binary to the multiclass setting.
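As a rough illustration (using placeholder notation, not necessarily the paper's own), a multiclass hard-margin condition and the resulting exponential bound can be sketched as follows, with conditional class probabilities $\eta_y(x) = \mathbb{P}(Y = y \mid X = x)$:

```latex
% Hard-margin condition: the most likely class beats the runner-up
% by a fixed gap h > 0, almost everywhere
\max_{y} \eta_y(x) \;-\; \max_{y \neq y^*(x)} \eta_y(x) \;\geq\; h
\quad \text{for a.e. } x,
\qquad y^*(x) = \operatorname*{arg\,max}_y \eta_y(x).

% Exponential rate: for constants C, c > 0 depending on h,
\mathbb{P}\bigl( R(\widehat{f}_n) - R^* > 0 \bigr) \;\leq\; C\, e^{-c n}.
```

Here $R$ denotes the misclassification risk, $R^*$ the Bayes risk, and $\widehat{f}_n$ an estimator trained on $n$ samples; $h$, $C$, and $c$ are illustrative constants. The absence of a bias-variance trade-off refers to the fact that, under such a margin, the excess risk vanishes at this exponential rate rather than at the usual polynomial rate balancing approximation and estimation error.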




A Case of Exponential Convergence Rates for SVM

Classification is often the first problem described in introductory mach...

Fast classification rates without standard margin assumptions

We consider the classical problem of learning rates for classes with fin...

Towards A Deeper Geometric, Analytic and Algorithmic Understanding of Margins

Given a matrix A, a linear feasibility problem (of which linear classifi...

Harder, Better, Faster, Stronger Convergence Rates for Least-Squares Regression

We consider the optimization of a quadratic objective function whose gra...

Beyond Least-Squares: Fast Rates for Regularized Empirical Risk Minimization through Self-Concordance

We consider learning methods based on the regularization of a convex emp...

Is margin preserved after random projection?

Random projections have been applied in many machine learning algorithms...

Bias-Variance Decompositions for Margin Losses

We introduce a novel bias-variance decomposition for a range of strictly...
