Neural Networks with Small Weights and Depth-Separation Barriers

05/31/2020
by   Gal Vardi, et al.

In studying the expressiveness of neural networks, an important question is whether there are functions which can only be approximated by sufficiently deep networks, assuming their size is bounded. However, for constant depths, existing results are limited to depths 2 and 3, and obtaining results for higher depths has remained an important open problem. In this paper, we focus on feedforward ReLU networks, and prove fundamental barriers to proving such results beyond depth 4, by reduction to open problems and natural-proof barriers in circuit complexity. To show this, we study a seemingly unrelated problem of independent interest: namely, whether there are polynomially-bounded functions which require super-polynomial weights in order to be approximated by constant-depth neural networks. We provide a negative and constructive answer to that question: if a function can be approximated by a polynomially-sized network of constant depth k with arbitrarily large weights, it can also be approximated by a polynomially-sized network of depth 3k+3 whose weights are polynomially bounded.
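The paper's result runs in the direction of compressing weights at the cost of a constant-factor increase in depth; the converse direction of the same currency exchange is easy to see concretely. The following minimal Python sketch (not the paper's construction; the helper names relu, scale_layer, and deep_bounded_scale are illustrative) shows how k ReLU layers, each with weights bounded by a small constant b, exactly compute multiplication by the super-polynomially large constant b^k, using the exact ReLU identity x = relu(x) - relu(-x):

import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def scale_layer(x, b):
    # Exact scaling via ReLU: b*relu(x) - b*relu(-x) == b*x for all real x.
    # Every weight used in this layer is bounded by max(b, 1).
    return b * relu(x) - b * relu(-x)

def deep_bounded_scale(x, b, k):
    # k layers, each with weights <= b, together compute (b**k) * x.
    for _ in range(k):
        x = scale_layer(x, b)
    return x

x = np.array([-1.5, 0.3, 2.0])
b, k = 2.0, 20
# A single linear layer computing the same map would need a weight of
# b**k = 2**20 (about 10^6); the depth-k network uses weights bounded by 2.
print(deep_bounded_scale(x, b, k))  # equals (2**20) * x
print((b ** k) * x)

The paper's contribution is the much harder converse: removing the need for large weights while keeping the depth constant (k becomes 3k+3), rather than letting depth grow with the weight magnitude as in this toy example.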


Related research

01/30/2021 · Size and Depth Separation in Approximating Natural Functions with Neural Networks
When studying the expressive power of neural networks, a main challenge ...

02/27/2017 · Depth Separation for Neural Networks
Let f: 𝕊^{d-1} × 𝕊^{d-1} → ℝ be a function of the form f(x,x') = g(⟨x,x'⟩) for g: ...

04/15/2019 · Depth Separations in Neural Networks: What is Actually Being Separated?
Existing depth separation results for constant-depth networks essentiall...

01/18/2021 · A simple geometric proof for the benefit of depth in ReLU networks
We present a simple proof for the benefit of depth in multi-layer feedfo...

07/15/2022 · Exact Flow Sparsification Requires Unbounded Size
Given a large edge-capacitated network G and a subset of k vertices call...

08/05/2022 · Why do networks have inhibitory/negative connections?
Why do brains have inhibitory connections? Why do deep networks have neg...

04/06/2021 · Proof of the Theory-to-Practice Gap in Deep Learning via Sampling Complexity Bounds for Neural Network Approximation Spaces
We study the computational complexity of (deterministic or randomized) a...
