Analysis of target data-dependent greedy kernel algorithms: Convergence rates for f-, f · P- and f/P-greedy

05/16/2021
by Tizian Wenzel, et al.

Data-dependent greedy algorithms in kernel spaces are known to provide fast-converging interpolants while being extremely easy to implement and efficient to run. Despite this experimental evidence, no detailed theory has yet been presented. This situation is unsatisfactory, especially when compared to the case of the data-independent P-greedy algorithm, for which optimal convergence rates are available even though its performance is usually inferior to that of target data-dependent algorithms. In this work we fill this gap by first defining a new scale of greedy algorithms for interpolation that comprises all the existing ones within a unified analysis, where the degree of dependency of the selection criterion on the functional data is quantified by a real parameter. We then prove new convergence rates in which this degree is taken into account, and we show that, possibly up to a logarithmic factor, target data-dependent selection strategies provide faster convergence. In particular, for the first time we obtain convergence rates for target data-adaptive interpolation that are faster than those given by uniform points, without the need for any special assumption on the target function. The rates are confirmed by a number of examples. These results are made possible by a new analysis of greedy algorithms in general Hilbert spaces.
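To make the selection rules named in the title concrete, the following is a minimal, self-contained sketch (not the authors' code) of a greedy kernel interpolation loop in which the selection criterion can be switched between the P-, f-, f·P- and f/P-greedy rules; the unified scale studied in the paper interpolates between such rules via a real parameter that weights the dependence on the target data. The Gaussian kernel, the one-dimensional test function, the candidate set and all parameter values below are illustrative assumptions only.

```python
# Sketch of greedy kernel interpolation with switchable selection rules.
# All kernels, targets and parameters are illustrative, not from the paper.
import numpy as np

def gauss_kernel(X, Y, eps=2.0):
    """Gaussian kernel matrix K[i, j] = exp(-eps^2 * (X[i] - Y[j])^2)."""
    d = X[:, None] - Y[None, :]
    return np.exp(-(eps * d) ** 2)

def greedy_kernel_interpolation(X, f_vals, n_points=20, rule="f", eps=2.0, reg=1e-12):
    """Greedily select centers from the candidate set X according to `rule`
    and return the selected indices and interpolation coefficients."""
    selected = []
    for _ in range(n_points):
        K_diag = np.ones(len(X))            # k(x, x) = 1 for the Gaussian kernel
        if selected:
            K_sel = gauss_kernel(X[selected], X[selected], eps) + reg * np.eye(len(selected))
            K_cross = gauss_kernel(X, X[selected], eps)          # shape (N, n)
            coeffs = np.linalg.solve(K_sel, f_vals[selected])
            interp = K_cross @ coeffs                            # current interpolant on X
            # squared power function: P_n(x)^2 = k(x, x) - k_n(x)^T K_sel^{-1} k_n(x)
            P2 = K_diag - np.einsum("ij,ij->i", K_cross, np.linalg.solve(K_sel, K_cross.T).T)
        else:
            interp = np.zeros(len(X))
            P2 = K_diag.copy()
        P = np.sqrt(np.maximum(P2, 0.0))
        residual = np.abs(f_vals - interp)
        residual[selected] = 0.0                                 # never reselect a center
        # selection criteria: data-independent (P) vs. target data-dependent (f, f*P, f/P)
        if rule == "P":
            criterion = P
        elif rule == "f":
            criterion = residual
        elif rule == "fP":
            criterion = residual * P
        elif rule == "f/P":
            criterion = np.where(P > 1e-10, residual / np.maximum(P, 1e-10), 0.0)
        else:
            raise ValueError(f"unknown rule: {rule}")
        selected.append(int(np.argmax(criterion)))
    K_sel = gauss_kernel(X[selected], X[selected], eps) + reg * np.eye(len(selected))
    return selected, np.linalg.solve(K_sel, f_vals[selected])

if __name__ == "__main__":
    X = np.linspace(-1.0, 1.0, 500)
    f_vals = np.abs(X) ** 1.5                                    # illustrative target of limited smoothness
    for rule in ("P", "f", "fP", "f/P"):
        idx, c = greedy_kernel_interpolation(X, f_vals, rule=rule)
        err = np.max(np.abs(f_vals - gauss_kernel(X, X[idx]) @ c))
        print(f"{rule:>4}-greedy, 20 centers: max error {err:.2e}")
```

The f/P rule divides by the power function, which nearly vanishes at the selected centers, so a crude safeguard is used above; the sketch is only meant to show how the residual (target data) and the power function (geometry) enter the different selection criteria.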
