Marginalising over Stationary Kernels with Bayesian Quadrature

06/14/2021
by Saad Hamid, et al.

Marginalising over families of Gaussian Process kernels produces flexible model classes with well-calibrated uncertainty estimates. Existing approaches require likelihood evaluations for many kernels, rendering them prohibitively expensive on larger datasets. We propose a Bayesian Quadrature scheme to make this marginalisation more efficient and thereby more practical. Using the maximum mean discrepancy between distributions, we define a kernel over kernels that captures invariances between Spectral Mixture (SM) kernels. Kernel samples are selected by generalising an information-theoretic acquisition function for warped Bayesian Quadrature. We show that our framework achieves more accurate predictions with better-calibrated uncertainty than state-of-the-art baselines, especially under limited (wall-clock) time budgets.
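The kernel-over-kernels idea can be illustrated concretely. A stationary SM kernel is characterised (by Bochner's theorem) by its spectral density, a Gaussian mixture over frequencies, so a distance between two SM kernels can be estimated as the maximum mean discrepancy (MMD) between samples from their spectral densities, and then exponentiated into a similarity. The sketch below is illustrative only, not the paper's implementation: the sampling helper, the Gaussian base kernel, the sample size, and the lengthscale are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_sm_spectral_density(weights, means, stds, n, rng):
    # Draw n frequency samples from the Gaussian-mixture spectral
    # density of a 1-D Spectral Mixture kernel (illustrative helper).
    w = np.asarray(weights, dtype=float)
    comps = rng.choice(len(w), size=n, p=w / w.sum())
    return rng.normal(np.asarray(means)[comps], np.asarray(stds)[comps])

def mmd2(x, y, gamma=1.0):
    # Biased sample estimate of squared MMD with a Gaussian base kernel.
    def k(a, b):
        return np.exp(-gamma * (a[:, None] - b[None, :]) ** 2)
    return k(x, x).mean() + k(y, y).mean() - 2.0 * k(x, y).mean()

def kernel_over_kernels(params_a, params_b, n=2000, lengthscale=1.0, rng=rng):
    # Exponentiated-MMD similarity between two SM kernels, computed
    # from samples of their spectral densities. Values lie in (0, 1].
    xa = sample_sm_spectral_density(*params_a, n, rng)
    xb = sample_sm_spectral_density(*params_b, n, rng)
    return np.exp(-mmd2(xa, xb) / (2.0 * lengthscale ** 2))

# SM kernels as (weights, spectral means, spectral stds); two kernels with
# nearby spectral densities should score higher than two distant ones.
p1 = ([1.0], [0.5], [0.1])
p2 = ([1.0], [0.6], [0.1])
p3 = ([1.0], [3.0], [0.1])
print(kernel_over_kernels(p1, p2) > kernel_over_kernels(p1, p3))
```

A similarity of this form is itself a positive-definite kernel over SM kernels, so it can parameterise the Gaussian process surrogate that Bayesian Quadrature places over the (warped) likelihood surface.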


