Interpolation with the polynomial kernels

by Giacomo Elefante, et al.

Polynomial kernels are widely used in machine learning, where they are one of the default choices for building kernel-based classification and regression models. In numerical analysis, however, they are rarely considered due to their lack of strict positive definiteness. In particular, they do not enjoy the usual property of unisolvency for arbitrary point sets, which is one of the key properties used to build kernel-based interpolation methods. This paper is devoted to establishing some initial results for the study of these kernels, and of their related interpolation algorithms, in the context of approximation theory. We first prove necessary and sufficient conditions on point sets which guarantee the existence and uniqueness of an interpolant. We then study the Reproducing Kernel Hilbert Spaces (or native spaces) of these kernels and their norms, and provide inclusion relations between spaces corresponding to different kernel parameters. With these spaces at hand, it is further possible to derive generic error estimates which apply to sufficiently smooth functions, thus escaping the native space. Finally, we show how to apply an efficient and stable algorithm to these kernels to obtain accurate interpolants, and we test them in some numerical experiments. After this analysis several computational and theoretical aspects remain open, and we outline possible further research directions in a concluding section. This work builds some bridges between kernel and polynomial interpolation, two topics to which the authors, to different extents, have been introduced under the supervision or through the work of Stefano De Marchi. For this reason, they wish to dedicate this work to him on the occasion of his 60th birthday.
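As a purely illustrative sketch (not taken from the paper), the basic kernel interpolation scheme discussed in the abstract can be written in a few lines of NumPy: build the Gram matrix of a polynomial kernel K(x, y) = (xy + c)^d at the nodes, solve for the coefficients, and evaluate the resulting interpolant. The kernel parameters d and c, the node choice, and the test function are all assumptions made for the example; as the abstract notes, invertibility of the Gram matrix is not guaranteed for arbitrary point sets, so the node set here is chosen as d + 1 distinct points in one dimension, where it holds.

```python
import numpy as np

def poly_kernel(x, y, d=4, c=1.0):
    """Polynomial kernel K(x, y) = (x*y + c)^d, evaluated pairwise."""
    return (np.outer(x, y) + c) ** d

# d + 1 = 5 distinct nodes in 1D: here the Gram matrix is nonsingular
x = np.linspace(0.0, 1.0, 5)
f = np.sin(2 * np.pi * x)          # illustrative target data

K = poly_kernel(x, x)              # Gram matrix at the nodes
coef = np.linalg.solve(K, f)       # interpolation coefficients

# evaluate the interpolant s(t) = sum_j coef[j] * K(t, x[j])
t = np.linspace(0.0, 1.0, 101)
s = poly_kernel(t, x) @ coef

# the interpolation conditions s(x_i) = f(x_i) hold at the nodes
residual = np.abs(poly_kernel(x, x) @ coef - f).max()
```

Note that this direct solve is only a conceptual sketch; the stable algorithms referred to in the abstract are needed in practice, since the Gram matrix of polynomial kernels can be severely ill-conditioned.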




