Metric Entropy Limits on Recurrent Neural Network Learning of Linear Dynamical Systems

05/06/2021
by Clemens Hutter, et al.

One of the most influential results in neural network theory is the universal approximation theorem [1, 2, 3], which states that continuous functions can be approximated to arbitrary accuracy by single-hidden-layer feedforward neural networks. The purpose of this paper is to establish a result in this spirit for the approximation of general discrete-time linear dynamical systems, including time-varying systems, by recurrent neural networks (RNNs). For the subclass of linear time-invariant (LTI) systems, we devise a quantitative version of this statement. Specifically, measuring the complexity of the considered class of LTI systems through metric entropy according to [4], we show that RNNs can optimally learn, or identify in system-theory parlance, stable LTI systems. For LTI systems whose input-output relation is characterized by a difference equation, this means that RNNs can learn the difference equation from input-output traces in a metric-entropy-optimal manner.
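To make the central object concrete: a discrete-time LTI system can be written in state-space form as x_{t+1} = A x_t + B u_t, y_t = C x_t, and a linear RNN with recursion h_{t+1} = W h_t + U u_t, y_t = V h_t realizes exactly the same input-output map whenever (W, U, V) = (A, B, C). The minimal NumPy sketch below checks this equivalence numerically; the matrices A, B, C are hypothetical and chosen only for illustration, and the sketch does not reproduce the paper's construction or its metric-entropy analysis.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stable single-input, single-output LTI system
# x_{t+1} = A x_t + B u_t,  y_t = C x_t.
# Stability here: the spectral radius of A is below 1.
A = np.array([[0.5, 0.1],
              [0.0, 0.3]])
B = np.array([[1.0],
              [0.5]])
C = np.array([[1.0, -1.0]])

def lti_response(u):
    """Run the LTI difference equation on a scalar input trace u."""
    x = np.zeros((A.shape[0], 1))
    ys = []
    for u_t in u:
        ys.append((C @ x).item())
        x = A @ x + B * u_t
    return np.array(ys)

def rnn_response(u, W, U, V):
    """Linear RNN h_{t+1} = W h_t + U u_t, y_t = V h_t on the same trace."""
    h = np.zeros((W.shape[0], 1))
    ys = []
    for u_t in u:
        ys.append((V @ h).item())
        h = W @ h + U * u_t
    return np.array(ys)

# With weights matching the system matrices, the two input-output maps agree.
u = rng.standard_normal(50)
assert np.allclose(lti_response(u), rnn_response(u, A, B, C))
```

The equivalence holds simply because both recursions share the same linear structure; the paper's contribution is to quantify, via metric entropy, how efficiently such systems can be learned from input-output traces.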

