'Local' vs. 'global' parameters -- breaking the Gaussian complexity barrier
We show that if F is a convex class of functions that is L-subgaussian, then the error rate of learning problems generated by independent noise is equivalent to a fixed point determined by 'local' covering estimates of the class, rather than by the Gaussian averages. To that end, we establish new sharp upper and lower estimates on the error rate of such problems.
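For context only (the notation below is generic and not taken from the paper itself), the 'global' benchmark that the abstract contrasts with is the usual fixed point of localized Gaussian averages. Writing {G_f : f in F} for the canonical Gaussian process indexed by F and f^* for the target, one schematic version of that fixed point, with constants, noise level, and sample size suppressed, is
\[
r^*(\theta) \;=\; \inf\Big\{\, r > 0 \;:\; \mathbb{E}\, \sup_{f \in F,\ \|f - f^*\|_{L_2} \le r} \big| G_f - G_{f^*} \big| \;\le\; \theta\, r^2 \,\Big\}.
\]
The 'local' parameter studied here is instead governed by covering estimates of F at a single scale; the precise definitions and the matching upper and lower bounds are given in the full text.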