Fast Approximate Multi-output Gaussian Processes
Gaussian process regression models are an appealing machine learning method as they learn expressive non-linear models from exemplar data with minimal parameter tuning and estimate both the mean and covariance of unseen points. However, the computational complexity, which grows cubically with the number of training samples, has been a long-standing challenge. During training, one has to compute and invert an N × N kernel matrix at every iteration, and regression requires computation of an m × N kernel matrix, where N and m are the numbers of training and test points respectively. In this work we show how approximating the covariance kernel using its eigenvalues and eigenfunctions leads to an approximate Gaussian process with a significant reduction in training and regression complexity. Training with the proposed approach requires computing only an N × n eigenfunction matrix and an n × n inverse, where n is a selected number of eigenvalues. Furthermore, regression now requires only an m × n matrix. Finally, in a special case the hyperparameter optimization is completely independent of the number of training samples. The proposed method can regress over multiple outputs, estimate derivatives of the regressor to any order, and learn the correlations between them. The computational complexity reduction, regression capabilities, and multi-output correlation learning are demonstrated in simulation examples.
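To illustrate the complexity argument, the sketch below shows a generic reduced-rank Gaussian process posterior built from an eigenfunction expansion of the kernel, k(x, x') ≈ Σᵢ λᵢ φᵢ(x) φᵢ(x'). It is a minimal sketch under that assumption, not the paper's implementation: the function name `approx_gp_posterior`, the default noise variance, and the particular 1-D basis (Laplacian eigenfunctions on a bounded interval paired with the spectral density of a squared-exponential kernel, in the style of Hilbert-space reduced-rank GPs) are all illustrative choices that may differ from the paper's construction.

```python
import numpy as np

def approx_gp_posterior(phi, lam, X, y, X_star, noise_var=1e-2):
    """Reduced-rank GP posterior from an eigenfunction expansion of the kernel.

    Assumes k(x, x') ~= sum_i lam[i] * phi_i(x) * phi_i(x'), i.e. K ~= Phi @ diag(lam) @ Phi.T,
    so only an n x n system is formed and inverted instead of the N x N system of an exact GP.
    """
    Phi = phi(X)            # N x n eigenfunction matrix at training inputs
    Phi_star = phi(X_star)  # m x n eigenfunction matrix at test inputs

    # Weight-space view: f(x) = phi(x) @ w with prior w ~ N(0, diag(lam)).
    A = Phi.T @ Phi / noise_var + np.diag(1.0 / lam)   # n x n posterior precision of w
    A_inv = np.linalg.inv(A)

    w_mean = A_inv @ (Phi.T @ y) / noise_var   # posterior mean of the weights
    mean = Phi_star @ w_mean                   # m predictive means
    cov = Phi_star @ A_inv @ Phi_star.T        # m x m predictive covariance of the latent function
    return mean, cov

# Illustrative 1-D basis (an assumption, not necessarily the paper's construction):
# Laplacian eigenfunctions on [-L, L] with eigenvalues from the spectral density
# of a squared-exponential kernel.
L, n = 5.0, 16
freqs = np.pi * np.arange(1, n + 1) / (2 * L)
lengthscale, signal_var = 1.0, 1.0
lam = signal_var * np.sqrt(2 * np.pi) * lengthscale * np.exp(-0.5 * (lengthscale * freqs) ** 2)
phi = lambda x: np.sin(freqs * (np.reshape(x, (-1, 1)) + L)) / np.sqrt(L)

X = np.random.uniform(-3, 3, size=50)
y = np.sin(X) + 0.1 * np.random.randn(50)
mean, cov = approx_gp_posterior(phi, lam, X, y, np.linspace(-3, 3, 100))
```

The complexity reduction described in the abstract comes from the n × n solve: the training data enter only through the N × n matrix Φ, and prediction only needs the m × n matrix Φ*.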