Depth-Width Trade-offs for Neural Networks via Topological Entropy

10/15/2020
by Kaifeng Bu, et al.

One of the central problems in deep learning theory is to understand how structural properties, such as depth, width, and the number of nodes, affect the expressivity of deep neural networks. In this work, we show a new connection between the expressivity of deep neural networks and topological entropy from dynamical systems theory, which can be used to characterize depth-width trade-offs of neural networks. We provide an upper bound on the topological entropy of neural networks with continuous semi-algebraic units in terms of their structure parameters. Specifically, the topological entropy of a ReLU network with l layers and m nodes per layer is upper bounded by O(l log m). In addition, if a neural network is a good approximation of some function f, then the size of the network is lower bounded exponentially in the topological entropy of f. Finally, we discuss the relationship between topological entropy, the number of oscillations, periods, and the Lipschitz constant.
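To make the O(l log m) bound and the oscillation counts concrete, the sketch below, which is not taken from the paper, uses the classical tent-map construction from the depth-width literature: the tent map can be written with two ReLU units as T(x) = 2·relu(x) - 4·relu(x - 1/2) on [0, 1], and composing it l times yields a depth-l, width-2 ReLU network whose graph has 2^l monotone pieces, hence topological entropy l·log 2. This is a minimal NumPy illustration with hypothetical function names, not an implementation of the paper's proofs.

```python
import numpy as np

def tent(x):
    # Tent map on [0, 1], built from two ReLU units:
    # T(x) = 2*relu(x) - 4*relu(x - 1/2) equals 2x for x < 1/2
    # and 2(1 - x) for x >= 1/2.
    relu = lambda z: np.maximum(z, 0.0)
    return 2.0 * relu(x) - 4.0 * relu(x - 0.5)

def iterated_tent(x, depth):
    # Composing the tent map `depth` times is a depth-`depth`,
    # width-2 ReLU network computing T^depth.
    for _ in range(depth):
        x = tent(x)
    return x

def count_monotone_pieces(f, grid):
    # Count direction changes of the discrete derivative; T^l has
    # 2^l monotone pieces, so log(#pieces) ~ l * log 2, the
    # topological entropy of the iterated map.
    y = f(grid)
    dy = np.sign(np.diff(y))
    dy = dy[dy != 0]
    return 1 + int(np.sum(dy[1:] != dy[:-1]))

grid = np.linspace(0.0, 1.0, 200001)
for l in range(1, 7):
    pieces = count_monotone_pieces(lambda x: iterated_tent(x, l), grid)
    print(f"depth {l}: {pieces} monotone pieces, "
          f"entropy estimate {np.log(pieces):.3f}")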


Related research

03/02/2020 · Better Depth-Width Trade-offs for Neural Networks through the lens of Dynamical Systems
The expressivity of neural networks as a function of their depth, width ...

05/25/2022 · Entropy Maximization with Depth: A Variational Principle for Random Neural Networks
To understand the essential role of depth in neural networks, we investi...

12/09/2019 · Depth-Width Trade-offs for ReLU Networks via Sharkovsky's Theorem
Understanding the representational power of Deep Neural Networks (DNNs) ...

05/25/2023 · Data Topology-Dependent Upper Bounds of Neural Network Widths
This paper investigates the relationship between the universal approxima...

04/03/2018 · Analysis on the Nonlinear Dynamics of Deep Neural Networks: Topological Entropy and Chaos
The theoretical explanation for deep neural network (DNN) is still an op...

01/28/2021 · Information contraction in noisy binary neural networks and its implications
Neural networks have gained importance as the machine learning models th...

10/19/2021 · Expressivity of Neural Networks via Chaotic Itineraries beyond Sharkovsky's Theorem
Given a target function f, how large must a neural network be in order t...
