Noncompact uniform universal approximation

08/07/2023
by Teun D. H. van Nuland, et al.

The universal approximation theorem is generalised to uniform convergence on the (noncompact) input space ℝ^n. All continuous functions that vanish at infinity can be uniformly approximated by neural networks with one hidden layer, for all continuous activation functions φ≠0 with asymptotically linear behaviour at ±∞. When φ is moreover bounded, we exactly determine which functions can be uniformly approximated by neural networks, with the following unexpected results. Let 𝒩_φ^l(ℝ^n) denote the vector space of functions that are uniformly approximable by neural networks with l hidden layers and n inputs. For all n and all l≥2, 𝒩_φ^l(ℝ^n) turns out to be an algebra under the pointwise product. If the left limit of φ differs from its right limit (for instance, when φ is sigmoidal), the algebra 𝒩_φ^l(ℝ^n) (l≥2) is independent of φ and l, and equals the closed span of products of sigmoids composed with one-dimensional projections. If the left limit of φ equals its right limit, 𝒩_φ^l(ℝ^n) (l≥1) equals the (real part of the) commutative resolvent algebra, a C*-algebra used in mathematical approaches to quantum theory. In the latter case, the algebra is independent of l≥1, whereas in the former case 𝒩_φ^2(ℝ^n) is strictly larger than 𝒩_φ^1(ℝ^n).
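
For intuition, below is a minimal numerical sketch (not taken from the paper) of the first statement: a one-hidden-layer network x ↦ Σ_j c_j φ(w_j x + b_j), with a bounded sigmoidal activation φ, fitted to the target exp(−x²), a continuous function vanishing at infinity, with the error measured in the supremum norm over a wide grid standing in for the noncompact input space ℝ. The number of hidden units, the random inner weights, and the least-squares fit of the outer weights are all assumptions made for this illustration.

# Minimal sketch (illustrative only, not the paper's method): fit a
# one-hidden-layer sigmoid network to exp(-x^2) and report the sup-norm error.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(t):
    # numerically stable logistic function
    out = np.empty_like(t, dtype=float)
    pos = t >= 0
    out[pos] = 1.0 / (1.0 + np.exp(-t[pos]))
    e = np.exp(t[~pos])
    out[~pos] = e / (1.0 + e)
    return out

def target(x):
    return np.exp(-x**2)           # vanishes at +-infinity

# Wide grid standing in for the noncompact input space R.
x = np.linspace(-50.0, 50.0, 20001)

# Random hidden-layer weights and biases; outer weights fitted by least squares.
m = 60
w = rng.normal(scale=2.0, size=m)
b = rng.uniform(-10.0, 10.0, size=m)
H = sigmoid(np.outer(x, w) + b)    # hidden-layer outputs, shape (len(x), m)
c, *_ = np.linalg.lstsq(H, target(x), rcond=None)

sup_error = np.max(np.abs(H @ c - target(x)))
print(f"sup-norm error of the fitted network on [-50, 50]: {sup_error:.2e}")

Increasing the number of hidden units and the extent of the grid probes the same sup-norm statement on ever larger parts of ℝ; the theorem itself guarantees that the error can be made arbitrarily small uniformly on all of ℝ^n.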


research
09/06/2022

Extending the Universal Approximation Theorem for a Broad Class of Hypercomplex-Valued Neural Networks

The universal approximation theorem asserts that a single hidden layer n...
research
11/02/2021

Some Questions of Uniformity in Algorithmic Randomness

The Ω numbers-the halting probabilities of universal prefix-free machine...
research
12/20/2019

(Newtonian) Space-Time Algebra

The space-time (s-t) algebra provides a mathematical model for communica...
research
03/30/2021

Uniform Envelopes

In the author's PhD thesis (2019) universal envelopes were introduced as...
research
10/25/2020

Neural Network Approximation: Three Hidden Layers Are Enough

A three-hidden-layer neural network with super approximation power is in...
research
05/18/2023

Clifford Group Equivariant Neural Networks

We introduce Clifford Group Equivariant Neural Networks: a novel approac...
research
08/06/2018

Beyond the Central Limit Theorem: Universal and Non-universal Simulations of Random Variables by General Mappings

The Central Limit Theorem states that a standard Gaussian random variabl...
