Size-Independent Sample Complexity of Neural Networks

12/18/2017
by Noah Golowich, et al.

We study the sample complexity of learning neural networks, by providing new bounds on their Rademacher complexity assuming norm constraints on the parameter matrix of each layer. Compared to previous work, these complexity bounds have improved dependence on the network depth, and under some additional assumptions, are fully independent of the network size (both depth and width). These results are derived using some novel techniques, which may be of independent interest.
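
As a rough sketch of the flavor of these bounds (the setting and constants below follow the standard statement of the paper's depth-dependence result and are filled in here rather than quoted from this page): for a depth-d network x \mapsto W_d \sigma(W_{d-1} \cdots \sigma(W_1 x)) with 1-Lipschitz, positive-homogeneous activations \sigma, inputs satisfying \|x\| \le B, and Frobenius-norm constraints \|W_j\|_F \le M(j) on each layer, the Rademacher complexity over m samples is bounded along the lines of

    \mathcal{R}_m \;\le\; \frac{B \left( \sqrt{2 \log(2)\, d} + 1 \right) \prod_{j=1}^{d} M(j)}{\sqrt{m}},

so the explicit dependence on depth is only \sqrt{d} and there is no dependence on width; under the additional assumptions mentioned in the abstract, the paper removes the \sqrt{d} factor as well, giving bounds independent of both depth and width.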


Related research

On Size-Independent Sample Complexity of ReLU Networks (06/03/2023)
We study the sample complexity of learning ReLU neural networks from the...

Initialization-Dependent Sample Complexity of Linear Predictors and Neural Networks (05/25/2023)
We provide several new results on the sample complexity of vector-valued...

Data-dependent Sample Complexity of Deep Neural Networks via Lipschitz Augmentation (05/09/2019)
Existing Rademacher complexity bounds for neural networks rely only on n...

Sample Complexity versus Depth: An Information Theoretic Analysis (03/01/2022)
Deep learning has proven effective across a range of data sets. In light...

Self-Regularity of Non-Negative Output Weights for Overparameterized Two-Layer Neural Networks (03/02/2021)
We consider the problem of finding a two-layer neural network with sigmo...

Generalization Bounds for Neural Networks: Kernels, Symmetry, and Sample Compression (11/05/2018)
Though Deep Neural Networks (DNNs) are widely celebrated for their pract...

The Sample Complexity of One-Hidden-Layer Neural Networks (02/13/2022)
We study norm-based uniform convergence bounds for neural networks, aimi...
