
Multiresolution Kernel Approximation for Gaussian Process Regression

by Yi Ding et al.

Gaussian process regression generally does not scale beyond a few thousand data points without some form of kernel approximation. Most approximations focus on the high-eigenvalue part of the spectrum of the kernel matrix, K, which leads to poor performance when the length scale of the kernel is small. In this paper we introduce Multiresolution Kernel Approximation (MKA), the first true broad-bandwidth kernel approximation algorithm. MKA is memory efficient and, being a direct method, also makes it easy to approximate K^-1 and det(K).
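To see why approximation is needed, consider exact GP regression: it requires factorising the n-by-n kernel matrix, an O(n^3) operation. The sketch below (not the paper's MKA algorithm; a minimal NumPy baseline with an assumed squared-exponential kernel) shows the exact computation and why a direct factorisation also gives det(K) cheaply, which is the property MKA preserves at scale.

```python
import numpy as np

def rbf_kernel(X, Y, length_scale=0.2):
    # Squared-exponential kernel. With a small length scale, K is close
    # to diagonal and has a flat spectrum, the regime where low-rank
    # (high-eigenvalue) approximations perform poorly.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / length_scale ** 2)

def gp_posterior_mean(X, y, X_star, noise=1e-2, length_scale=0.2):
    # Exact GP regression: O(n^3) Cholesky factorisation of K + noise*I.
    # This cubic cost is what kernel approximation methods avoid.
    K = rbf_kernel(X, X, length_scale) + noise * np.eye(len(X))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    # A direct factorisation also yields the determinant almost for free:
    # log det(K + noise*I) = 2 * sum(log(diag(L))).
    K_star = rbf_kernel(X_star, X, length_scale)
    return K_star @ alpha

rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, (200, 1))
y = np.sin(6.0 * X[:, 0]) + 0.05 * rng.standard_normal(200)
X_star = np.array([[0.5]])
mu = gp_posterior_mean(X, y, X_star)
```

A direct method such as MKA keeps exactly this advantage: because it produces a factorised approximation of K rather than only matrix-vector products, both K^-1 y and det(K) fall out of the factorisation.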



