Variational Gaussian Processes with Signature Covariances

06/19/2019
by   Csaba Tóth, et al.

We introduce a Bayesian approach to learning from stream-valued data using Gaussian processes with the recently introduced signature kernel as covariance function. To cope with the computational complexity in time and memory that arises with long streams evolving in large state spaces, we develop a variational Bayes approach with sparse inducing tensors. We provide an implementation based on GPflow and benchmark this variational Gaussian process model on supervised classification tasks for time series and text (a stream of words).
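The signature kernel mentioned in the abstract can be illustrated with a minimal sketch: the truncated signature of a path collects its iterated (discrete) integrals up to some level, and the kernel between two streams is the inner product of their signatures. The NumPy code below is a hypothetical illustration truncated at level 2 for discretely sampled paths; the paper's method operates on the kernel with sparse inducing tensors rather than materializing these features explicitly, so treat this only as a sketch of the underlying feature map.

```python
import numpy as np

def truncated_signature(path):
    """Level-2 truncated signature of a discrete path of shape (T, d)."""
    dX = np.diff(path, axis=0)                 # increments, shape (T-1, d)
    d = dX.shape[1]
    s1 = dX.sum(axis=0)                        # level 1: total increment
    csum = np.cumsum(dX, axis=0)
    prev = np.vstack([np.zeros((1, d)), csum[:-1]])  # increments seen before step t
    s2 = prev.T @ dX                           # level 2: S[i, j] = sum_{s<t} dX_s[i] * dX_t[j]
    return np.concatenate([[1.0], s1, s2.ravel()])   # level 0 is the constant 1

def signature_kernel(x, y):
    """Inner product of truncated signatures: a level-2 approximation
    of the signature kernel between two streams."""
    return float(truncated_signature(x) @ truncated_signature(y))
```

Used as a covariance function, `signature_kernel` is positive semi-definite by construction (it is an explicit inner product); the cost of materializing signatures grows as d^m with truncation level m, which is the complexity the paper's variational approach with inducing tensors is designed to avoid.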

Related research

01/15/2020 · Doubly Sparse Variational Gaussian Processes
The use of Gaussian process models is typically limited to datasets with...

05/10/2021 · SigGPDE: Scaling Sparse Gaussian Processes on Sequential Data
Making predictions and quantifying their uncertainty when the input data...

12/16/2009 · Variational Inducing Kernels for Sparse Convolved Multiple Output Gaussian Processes
Interest in multioutput kernel methods is increasing, whether under the ...

07/13/2020 · Orthogonally Decoupled Variational Fourier Features
Sparse inducing points have long been a standard method to fit Gaussian ...

10/27/2015 · Blitzkriging: Kronecker-structured Stochastic Gaussian Processes
We present Blitzkriging, a new approach to fast inference for Gaussian p...

07/14/2021 · Spectrum Gaussian Processes Based On Tunable Basis Functions
Spectral approximation and variational inducing learning for the Gaussia...

05/16/2013 · Evolution of Covariance Functions for Gaussian Process Regression using Genetic Programming
In this contribution we describe an approach to evolve composite covaria...
