Stability of convergence rates: Kernel interpolation on non-Lipschitz domains

03/23/2022
by Tizian Wenzel, et al.

Error estimates for kernel interpolation in reproducing kernel Hilbert spaces (RKHS) usually assume quite restrictive properties on the shape of the domain, especially in the case of infinitely smooth kernels such as the popular Gaussian kernel. In this paper we leverage an analysis of greedy kernel algorithms to prove that convergence results (in the number of interpolation points) can be obtained for kernel interpolation on arbitrary domains Ω⊂ℝ^d, thus allowing for non-Lipschitz domains, including e.g. domains with cusps and irregular boundaries. In particular, we show that when passing to a smaller domain Ω̃⊂Ω⊂ℝ^d, the convergence rate does not deteriorate, i.e. the convergence rates are stable under restriction to a subset. The impact of this result is illustrated on examples of kernels of finite as well as infinite smoothness, such as the Gaussian kernel. A comparison to approximation in Sobolev spaces is drawn, where the shape of the domain Ω does have an impact on the approximation properties. Numerical experiments illustrate and confirm the analysis.


