Fast Approximate Multi-output Gaussian Processes

08/22/2020
by Vladimir Joukov, et al.

Gaussian process regression models are an appealing machine learning method: they learn expressive non-linear models from exemplar data with minimal parameter tuning and estimate both the mean and covariance of unseen points. However, the cubic growth of computational cost with the number of training samples has been a long-standing challenge. During training, one has to compute and invert an N × N kernel matrix at every iteration. Regression requires computing an m × N kernel matrix, where N and m are the numbers of training and test points, respectively. In this work we show how approximating the covariance kernel with its eigenvalues and eigenfunctions leads to an approximate Gaussian process with a significant reduction in training and regression complexity. Training with the proposed approach requires computing only an N × n eigenfunction matrix and an n × n inverse, where n is the selected number of eigenvalues. Furthermore, regression requires only an m × n matrix. Finally, in a special case the hyperparameter optimization is completely independent of the number of training samples. The proposed method can regress over multiple outputs, estimate the derivative of the regressor to any order, and learn the correlations between them. The computational complexity reduction, regression capabilities, and multi-output correlation learning are demonstrated in simulation examples.
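
To make the construction concrete, here is a minimal sketch of eigenfunction-based reduced-rank GP regression in NumPy. It uses the well-known Hilbert-space approximation of an RBF kernel on an interval (Laplacian eigenfunctions weighted by the kernel's spectral density) as a stand-in for the paper's expansion, not the authors' exact method; the domain half-width L, the eigenvalue count n_eig, and all hyperparameter values are illustrative assumptions. The key point matches the abstract: the only large matrix formed is the N × n eigenfunction matrix, and the only system solved is n × n.

```python
import numpy as np

def eigenfunctions(x, n_eig, L):
    """N x n matrix of Laplacian eigenfunctions on [-L, L]."""
    j = np.arange(1, n_eig + 1)
    return np.sqrt(1.0 / L) * np.sin(np.pi * j * (x[:, None] + L) / (2 * L))

def rbf_spectral_density(omega, sigma_f, ell):
    """RBF spectral density; evaluated at the eigenfrequencies it
    yields the approximate kernel eigenvalues."""
    return sigma_f**2 * np.sqrt(2 * np.pi) * ell * np.exp(-0.5 * (ell * omega)**2)

# Toy 1-D data (assumed for illustration).
rng = np.random.default_rng(0)
x_train = rng.uniform(-3, 3, 200)
y_train = np.sin(x_train) + 0.1 * rng.standard_normal(200)
x_test = np.linspace(-3, 3, 50)

n_eig, L = 20, 5.0                      # n eigenvalues, domain half-width
sigma_f, ell, sigma_n = 1.0, 0.5, 0.1   # kernel and noise hyperparameters

omega = np.pi * np.arange(1, n_eig + 1) / (2 * L)   # eigenfrequencies
lam = rbf_spectral_density(omega, sigma_f, ell)     # kernel eigenvalues

Phi = eigenfunctions(x_train, n_eig, L)    # N x n: the only large matrix
Phi_s = eigenfunctions(x_test, n_eig, L)   # m x n: all regression needs

# Woodbury form: K ~= Phi diag(lam) Phi^T, so training reduces to an
# n x n solve instead of inverting the N x N kernel matrix.
A = Phi.T @ Phi + sigma_n**2 * np.diag(1.0 / lam)           # n x n
mean = Phi_s @ np.linalg.solve(A, Phi.T @ y_train)          # predictive mean
cov = sigma_n**2 * (Phi_s @ np.linalg.solve(A, Phi_s.T))    # predictive cov
```

With N = 200 training points and n = 20 eigenvalues, the dense 200 × 200 inversion is replaced by a 20 × 20 solve, and prediction at m test points needs only the m × n matrix Phi_s, mirroring the complexity reductions claimed above.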

Related research

04/29/2020
Sparse Cholesky factorization by Kullback-Leibler minimization
We propose to compute a sparse approximate inverse Cholesky factor L of ...

07/09/2018
Ensemble Kalman Filtering for Online Gaussian Process Regression and Learning
Gaussian process regression is a machine learning approach which has bee...

03/19/2018
Asymmetric kernel in Gaussian Processes for learning target variance
This work incorporates the multi-modality of the data distribution into ...

01/28/2021
Faster Kernel Interpolation for Gaussian Processes
A key challenge in scaling Gaussian Process (GP) regression to massive d...

03/11/2021
The Minecraft Kernel: Modelling correlated Gaussian Processes in the Fourier domain
In the univariate setting, using the kernel spectral representation is a...

11/19/2017
Sequential Randomized Matrix Factorization for Gaussian Processes: Efficient Predictions and Hyper-parameter Optimization
This paper presents a sequential randomized lowrank matrix factorization...

01/05/2017
Overlapping Cover Local Regression Machines
We present the Overlapping Domain Cover (ODC) notion for kernel machines...
