Kernel quadrature with DPPs

06/18/2019
by Ayoub Belhadji et al.

We study quadrature rules for functions living in an RKHS, using nodes sampled from a projection determinantal point process (DPP). DPPs are parametrized by a kernel, and we use a truncated and saturated version of the RKHS kernel. This natural link between the two kernels, along with DPP machinery, leads to relatively tight bounds on the quadrature error that depend on the spectrum of the RKHS kernel. Finally, we experimentally compare DPPs to existing kernel-based quadratures such as herding, Bayesian quadrature, and continuous leverage score sampling. Numerical results confirm the appeal of DPPs, and even suggest faster rates than our bounds in particular cases.
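To make the kernel-quadrature side of the abstract concrete (leaving aside the DPP node sampling, which is the paper's contribution), here is a minimal sketch of a kernel quadrature rule: given nodes, the worst-case-optimal weights in the RKHS solve a linear system involving the kernel matrix and the kernel mean embedding. The Brownian-motion kernel and the fixed nodes below are illustrative choices, not taken from the paper, and the helper names are hypothetical.

```python
import numpy as np

def min_kernel(x, y):
    # Brownian-motion kernel k(x, y) = min(x, y) on [0, 1];
    # its RKHS consists of absolutely continuous f with f(0) = 0.
    return np.minimum(x[:, None], y[None, :])

def quadrature_weights(nodes):
    # Worst-case-optimal weights for the uniform measure on [0, 1]:
    # w = K^{-1} z, where z_i = integral of k(x_i, y) dy = x_i - x_i^2 / 2.
    K = min_kernel(nodes, nodes)
    z = nodes - nodes**2 / 2.0
    return np.linalg.solve(K, z)

def kernel_quadrature(f, nodes):
    # Approximate the integral of f over [0, 1] by a weighted sum at the nodes.
    w = quadrature_weights(nodes)
    return float(w @ f(nodes))

# Illustrative fixed nodes; the paper would instead sample them from a
# projection DPP tied to the RKHS kernel's spectral truncation.
nodes = np.array([0.2, 0.5, 0.9])
print(kernel_quadrature(lambda x: x, nodes))  # close to the exact value 0.5
```

This rule integrates the kernel interpolant of f exactly, so its error is governed by how well the RKHS interpolant at the chosen nodes approximates f, which is exactly the quantity the DPP node distribution is designed to control.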

Related research

06/06/2023 · Kernel Quadrature with Randomly Pivoted Cholesky
This paper presents new quadrature rules for functions in a reproducing ...

02/22/2020 · Kernel interpolation with continuous volume sampling
A fundamental task in kernel methods is to pick nodes and weights, so as...

07/20/2021 · Positively Weighted Kernel Quadrature via Subsampling
We study kernel quadrature rules with positive weights for probability m...

03/15/2012 · Super-Samples from Kernel Herding
We extend the herding algorithm to continuous spaces by using the kernel...

09/03/2023 · An analysis of Ermakov-Zolotukhin quadrature using kernels
We study a quadrature, proposed by Ermakov and Zolotukhin in the sixties...

12/05/2018 · Relative concentration bounds for the kernel matrix spectrum
In this paper, we study the concentration properties of the kernel matri...

09/05/2019 · Empirical Notes on the Interaction Between Continuous Kernel Fuzzing and Development
Fuzzing has been studied and applied ever since the 1990s. Automated and...
