Approximate is Good Enough: Probabilistic Variants of Dimensional and Margin Complexity

03/09/2020
by Pritish Kamath, et al.

We present and study approximate notions of dimensional and margin complexity, which correspond to the minimal dimension or norm of an embedding required to approximate, rather than exactly represent, a given hypothesis class. We show that such notions are not only sufficient for learning using linear predictors or a kernel, but, unlike the exact variants, are also necessary. They are thus better suited for discussing the limitations of linear and kernel methods.
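For intuition, here is one plausible formalization of the contrast the abstract describes, stated as a sketch: the standard (exact) dimension complexity below is the textbook definition, while the probabilistic relaxation shown after it is an assumed reading of the abstract (a random embedding that only needs to agree with each hypothesis on each point with high probability); the paper's precise variants may differ in details.

% Exact dimension complexity: the smallest embedding dimension d such that
% every h in the class H is realized exactly as a halfspace over the embedding.
\mathrm{dc}(\mathcal{H}) \;=\; \min\Big\{ d \;:\; \exists\, \phi:\mathcal{X}\to\mathbb{R}^d,\ \{w_h\}_{h\in\mathcal{H}}\subset\mathbb{R}^d \ \text{s.t.}\ \operatorname{sign}\big(\langle w_h,\phi(x)\rangle\big)=h(x)\ \ \forall\, h\in\mathcal{H},\ x\in\mathcal{X} \Big\}

% Assumed probabilistic relaxation: a randomized embedding need only agree with
% each h on each x with probability at least 1 - epsilon, rather than everywhere.
\mathrm{dc}_{\epsilon}(\mathcal{H}) \;=\; \min\Big\{ d \;:\; \exists\ \text{random}\ \phi:\mathcal{X}\to\mathbb{R}^d \ \text{s.t.}\ \Pr_{\phi}\big[\operatorname{sign}\big(\langle w_h,\phi(x)\rangle\big)\neq h(x)\big]\le\epsilon\ \ \forall\, h\in\mathcal{H},\ x\in\mathcal{X} \Big\}

The margin complexity variant is analogous, with the minimum taken over the norm of the weight vectors (subject to a unit margin) rather than over the dimension d.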
