Sampling-based Nyström Approximation and Kernel Quadrature

01/23/2023
by Satoshi Hayakawa et al.

We analyze the Nyström approximation of a positive definite kernel associated with a probability measure. We first prove an improved error bound for the conventional Nyström approximation with i.i.d. sampling and singular-value decomposition in the continuous regime; the proof techniques are borrowed from statistical learning theory. We then introduce a refined subspace selection for the Nyström approximation that comes with theoretical guarantees and applies to non-i.i.d. landmark points. Finally, we discuss the application of these results to convex kernel quadrature and provide novel theoretical guarantees together with numerical observations.
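To make the "conventional Nyström approximation with i.i.d. sampling and singular-value decomposition" concrete, the sketch below implements the standard textbook version of that construction, not the paper's refined subspace selection. The Gaussian kernel, the standard-normal sampling measure, the bandwidth, and the rank/landmark counts are illustrative assumptions chosen only for the demo.

```python
# A minimal sketch of the conventional Nystrom approximation:
# draw m landmarks i.i.d. from the measure mu, eigendecompose the
# landmark Gram matrix (SVD of a symmetric PSD matrix), and keep the
# top-r eigenpairs to get a rank-r feature map.
# Assumptions (not from the paper): Gaussian kernel, mu = N(0, I).
import numpy as np

def gaussian_kernel(X, Y, bandwidth=1.0):
    """k(x, y) = exp(-||x - y||^2 / (2 * bandwidth^2))."""
    sq = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-sq / (2.0 * bandwidth ** 2))

def nystrom_features(landmarks, rank, kernel):
    """Return a map X -> Phi(X) in R^{n x rank} with Phi Phi^T ~ K."""
    K_mm = kernel(landmarks, landmarks)            # m x m landmark Gram matrix
    eigval, eigvec = np.linalg.eigh(K_mm)          # ascending eigenvalues
    idx = np.argsort(eigval)[::-1][:rank]          # keep the top-r spectrum
    U, lam = eigvec[:, idx], np.maximum(eigval[idx], 1e-12)
    W = U / np.sqrt(lam)                           # m x r whitening matrix
    return lambda X: kernel(X, landmarks) @ W      # n x r Nystrom features

rng = np.random.default_rng(0)
m, r, n = 200, 30, 500
landmarks = rng.standard_normal((m, 2))            # i.i.d. samples from mu
phi = nystrom_features(landmarks, r, gaussian_kernel)

X = rng.standard_normal((n, 2))
K_true = gaussian_kernel(X, X)
K_hat = phi(X) @ phi(X).T                          # rank-r approximation of K
print("relative error:", np.linalg.norm(K_true - K_hat) / np.linalg.norm(K_true))
```

The resulting finite-rank kernel k̂(x, y) = ⟨Φ(x), Φ(y)⟩ is the object whose approximation error the paper bounds; plugging such a low-rank surrogate into a quadrature construction is the general pattern behind the kernel-quadrature application discussed in the abstract.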


