Network-size independent covering number bounds for deep networks

11/02/2017
by Mayank Kabra, et al.

We give a covering number bound for deep learning networks that is independent of the size of the network. The key to this simple analysis is that, for linear classifiers, rotating the data does not affect the covering number. We can therefore ignore the rotation part of each layer's linear transformation and obtain the covering number bound by concentrating on the scaling part.
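The rotation-invariance argument can be illustrated with a short NumPy sketch (variable names and the SVD-based rotation/scaling split are illustrative assumptions, not code from the paper): an orthogonal rotation preserves pairwise distances between data points, so it cannot change how many balls are needed to cover them; only the scaling part matters.

```python
import numpy as np

rng = np.random.default_rng(0)

# Data matrix: n points in d dimensions.
X = rng.standard_normal((100, 5))

# A layer's linear transformation, split by SVD into rotation
# parts (orthogonal U, Vt) and a scaling part (singular values S):
# W = U @ diag(S) @ Vt.
W = rng.standard_normal((5, 5))
U, S, Vt = np.linalg.svd(W)

# Rotating the data with the orthogonal factor preserves all
# pairwise distances, hence the covering number is unchanged.
rotated = X @ Vt.T
d_orig = np.linalg.norm(X[0] - X[1])
d_rot = np.linalg.norm(rotated[0] - rotated[1])
assert np.isclose(d_orig, d_rot)

# Only the scaling by the singular values changes distances, so a
# covering number bound can concentrate on S alone.  Composing the
# pieces recovers the full layer output: X @ W.T == (rotated * S) @ U.T.
scaled = rotated * S
assert np.allclose(X @ W.T, scaled @ U.T)
```

The orthogonal factors drop out of the covering argument because they are isometries; the singular values `S` carry all the distance distortion a layer can introduce.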
