Optimal Stable Nonlinear Approximation

09/21/2020
by   Albert Cohen, et al.

While it is well known that nonlinear methods of approximation can often perform dramatically better than linear methods, questions remain about how to measure the best possible performance of such methods. This paper studies nonlinear methods of approximation that are compatible with numerical implementation, in the sense that they are required to be numerically stable. A measure of optimal performance, called stable manifold widths, is introduced for approximating a model class K in a Banach space X by stable manifold methods. Fundamental inequalities between these stable manifold widths and the entropy of K are established. The effects of requiring stability in the settings of deep learning and compressed sensing are discussed.
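As a rough sketch of the kind of quantity the abstract describes (the notation here is illustrative, not necessarily the paper's exact formulation): a manifold method approximates each f in K by first encoding it with a map a into R^n and then reconstructing with a map M back into X, and stability asks both maps to be Lipschitz.

```latex
% Illustrative sketch of a stable manifold width; see the paper
% for the precise definition and constants.
% An n-dimensional manifold method is a pair of maps
%   a : X \to \mathbb{R}^n   (encoding),
%   M : \mathbb{R}^n \to X   (reconstruction).
% Requiring both maps to be \gamma-Lipschitz gives a stable width
\[
  \delta_{n,\gamma}(K)_X
  \;=\;
  \inf_{\substack{a,\,M \\ \gamma\text{-Lipschitz}}}
  \;\sup_{f \in K}\,
  \bigl\| f - M\bigl(a(f)\bigr) \bigr\|_X ,
\]
% which is then compared against the entropy numbers
% \varepsilon_n(K)_X of the model class K.
```

The point of the Lipschitz requirement is that small perturbations of f (or of its encoded parameters) can only perturb the reconstruction proportionally, which is what makes the method numerically implementable.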


Related research

- 11/02/2021, Lipschitz widths: This paper introduces a measure, called Lipschitz widths, of the optimal...
- 10/11/2019, Zap Q-Learning With Nonlinear Function Approximation: The Zap stochastic approximation (SA) algorithm was introduced recently ...
- 07/17/2018, On Recovery Guarantees for One-Bit Compressed Sensing on Manifolds: This paper studies the problem of recovering a signal from one-bit compr...
- 10/13/2021, Learning Stable Koopman Embeddings: In this paper, we present a new data-driven method for learning stable m...
- 10/15/2021, Learning the Koopman Eigendecomposition: A Diffeomorphic Approach: We present a novel data-driven approach for learning linear representati...
- 12/01/2020, Kernel methods for center manifold approximation and a data-based version of the Center Manifold Theorem: For dynamical systems with a non hyperbolic equilibrium, it is possible ...
- 09/19/2022, Nonlinear approximation spaces for inverse problems: This paper is concerned with the ubiquitous inverse problem of recoverin...
