
Kernel quadrature with DPPs

by Ayoub Belhadji et al.

We study quadrature rules for functions living in an RKHS, using nodes sampled from a projection determinantal point process (DPP). DPPs are parametrized by a kernel, and we use a truncated and saturated version of the RKHS kernel. This natural link between the two kernels, along with DPP machinery, leads to relatively tight bounds on the quadrature error that depend on the spectrum of the RKHS kernel. Finally, we experimentally compare DPPs to existing kernel-based quadratures such as herding, Bayesian quadrature, and continuous leverage score sampling. Numerical results confirm the appeal of DPPs, and even suggest faster convergence rates than our bounds in particular cases.
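To make the pipeline concrete, here is a minimal sketch of DPP-based kernel quadrature on a discretized domain. It is not the paper's exact construction: it assumes a Gaussian RKHS kernel, a uniform base measure on an m-point grid, and a projection DPP built from the top-N eigenvectors of the kernel matrix (standing in for the truncated kernel the abstract mentions). All names and parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

m, N = 200, 8                      # grid size, number of quadrature nodes
x = np.linspace(0.0, 1.0, m)

def gauss_kernel(a, b, ell=0.2):
    # Gaussian (RBF) kernel; ell is an assumed lengthscale.
    return np.exp(-(a[:, None] - b[None, :]) ** 2 / (2 * ell ** 2))

K = gauss_kernel(x, x)

# Projection DPP kernel: orthogonal projection onto the top-N eigenvectors
# of the kernel matrix (a discrete stand-in for the truncated RKHS kernel).
eigvals, eigvecs = np.linalg.eigh(K)
V = eigvecs[:, -N:]                # columns = leading eigenvectors
P = V @ V.T

# Exact sequential sampling of a projection DPP (chain rule + Schur updates):
# pick a point proportionally to the residual diagonal, then condition on it.
A = P.copy()
nodes = []
for _ in range(N):
    probs = np.clip(np.diag(A), 0.0, None)
    probs /= probs.sum()
    i = rng.choice(m, p=probs)
    nodes.append(i)
    c = A[:, i].copy()
    A -= np.outer(c, c) / c[i]     # Schur complement: condition on point i

# Quadrature weights minimizing the worst-case RKHS error:
# solve K(S, S) w = z, where z_i = mean_j k(x_i, grid_j) is the
# (discrete) mean embedding of the uniform measure.
S = np.array(nodes)
Kss = K[np.ix_(S, S)] + 1e-9 * np.eye(N)   # tiny jitter for conditioning
z = K[S, :].mean(axis=1)
w = np.linalg.solve(Kss, z)

# Use the rule: integrate a smooth test function against the grid measure.
f = lambda t: np.cos(2 * np.pi * t)
estimate = w @ f(x[S])
truth = f(x).mean()
```

With a smooth integrand and a Gaussian kernel, the N-node estimate is typically very close to the true (grid) integral, which is the behavior the error bounds above quantify via the kernel's spectrum.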


Kernel Quadrature with Randomly Pivoted Cholesky

This paper presents new quadrature rules for functions in a reproducing ...

Kernel interpolation with continuous volume sampling

A fundamental task in kernel methods is to pick nodes and weights, so as...

Positively Weighted Kernel Quadrature via Subsampling

We study kernel quadrature rules with positive weights for probability m...

Super-Samples from Kernel Herding

We extend the herding algorithm to continuous spaces by using the kernel...

An analysis of Ermakov-Zolotukhin quadrature using kernels

We study a quadrature, proposed by Ermakov and Zolotukhin in the sixties...

Relative concentration bounds for the kernel matrix spectrum

In this paper, we study the concentration properties of the kernel matri...
