Composite Gaussian Processes: Scalable Computation and Performance Analysis

01/31/2018
by Xiuming Liu, et al.

Gaussian process (GP) models provide a powerful tool for prediction but are computationally prohibitive for large data sets. In such scenarios, one has to resort to approximate methods. We derive an approximation based on a composite likelihood approach using a general belief updating framework, which leads to a recursive computation of the predictor as well as of the hyper-parameter learning. We then analyze the derived composite GP model in predictive and information-theoretic terms. Finally, we evaluate the approximation on both synthetic data and a real-world application.
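The abstract describes fusing block-wise GP computations recursively rather than inverting one large covariance matrix. The paper's exact composite-likelihood construction is not given here, so the sketch below is only an illustration of the general idea: each data block yields an exact GP posterior, and the blocks are fused by precision-weighted averaging with a Bayesian-committee-machine-style prior correction. All function names and the RBF kernel choice are assumptions, not the authors' implementation.

```python
import numpy as np

def rbf_kernel(X1, X2, length=1.0, var=1.0):
    """Squared-exponential covariance between row-vector inputs."""
    d2 = (np.sum(X1**2, 1)[:, None] + np.sum(X2**2, 1)[None, :]
          - 2.0 * X1 @ X2.T)
    return var * np.exp(-0.5 * d2 / length**2)

def gp_predict(X, y, Xs, noise=0.1):
    """Exact GP posterior mean and variance from a single data block."""
    K = rbf_kernel(X, X) + noise**2 * np.eye(len(X))
    Ks = rbf_kernel(X, Xs)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    mu = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    var = np.diag(rbf_kernel(Xs, Xs)) - np.sum(v**2, axis=0)
    return mu, var

def composite_gp_predict(blocks, Xs, noise=0.1, prior_var=1.0):
    """Recursively fuse per-block GP predictions (precision weighting).

    Each block contributes independently, so the update can be run
    one block at a time -- the recursive flavor the abstract mentions.
    """
    prec = np.zeros(len(Xs))   # accumulated posterior precision
    wmu = np.zeros(len(Xs))    # accumulated precision-weighted mean
    for X, y in blocks:
        mu, var = gp_predict(X, y, Xs, noise)
        prec += 1.0 / var
        wmu += mu / var
    # remove the (M - 1) duplicated prior factors, as in the BCM
    prec -= (len(blocks) - 1) / prior_var
    return wmu / prec, 1.0 / prec

# Toy demonstration: two blocks of noisy sine data.
rng = np.random.default_rng(0)
X = np.linspace(-3.0, 3.0, 60)[:, None]
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(60)
blocks = [(X[:30], y[:30]), (X[30:], y[30:])]
Xs = np.array([[0.0], [1.5]])
mu, var = composite_gp_predict(blocks, Xs)
```

Each block costs O(n_b^3) instead of O(n^3) for the full data set, which is the computational motivation for composite approaches; the fused mean at the test points should track sin(x) closely on this toy example.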

Related research:

- 11/02/2012 · Deep Gaussian Processes — In this paper we introduce deep Gaussian process (GP) models. Deep GPs a...
- 06/19/2019 · Multi-resolution Multi-task Gaussian Processes — We consider evidence integration from potentially dependent observation ...
- 05/15/2022 · Incorporating Prior Knowledge into Neural Networks through an Implicit Composite Kernel — It is challenging to guide neural network (NN) learning with prior knowl...
- 10/31/2015 · Gaussian Process Random Fields — Gaussian processes have been successful in both supervised and unsupervi...
- 01/24/2022 · Design Strategies and Approximation Methods for High-Performance Computing Variability Management — Performance variability management is an active research area in high-pe...
- 11/15/2022 · Provably Reliable Large-Scale Sampling from Gaussian Processes — When comparing approximate Gaussian process (GP) models, it can be helpf...
- 04/05/2016 · Fast methods for training Gaussian processes on large data sets — Gaussian process regression (GPR) is a non-parametric Bayesian technique...
