Multiple Adaptive Bayesian Linear Regression for Scalable Bayesian Optimization with Warm Start

by Valerio Perrone, et al.

Bayesian optimization (BO) is a model-based approach for gradient-free black-box function optimization. Typically, BO is powered by a Gaussian process (GP), whose algorithmic complexity is cubic in the number of evaluations. Hence, GP-based BO cannot leverage large amounts of past or related function evaluations, for example, to warm start the BO procedure. We develop a multiple adaptive Bayesian linear regression model as a scalable alternative whose complexity is linear in the number of observations. The multiple Bayesian linear regression models are coupled through a shared feedforward neural network, which learns a joint representation and transfers knowledge across machine learning problems.
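The key computational point is that, once features are fixed, Bayesian linear regression has cost linear in the number of observations. Below is a minimal sketch of this idea in NumPy: a hypothetical fixed random tanh network stands in for the paper's shared feedforward network (which would in practice be learned jointly across tasks), and a conjugate Bayesian linear regression is fit on top of its features. All names and hyperparameters here are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical shared feature map standing in for the learned feedforward
# network: a fixed random one-hidden-layer tanh network.
D_IN, H = 1, 16
W1 = rng.normal(size=(D_IN, H))
b1 = rng.normal(size=H)

def phi(X):
    """Shared representation, shape (n, H)."""
    return np.tanh(X @ W1 + b1)

def blr_posterior(X, y, alpha=1.0, beta=25.0):
    """Conjugate Bayesian linear regression on features phi(X).

    Cost is O(n * H^2 + H^3): linear in the number of observations n,
    in contrast to the O(n^3) of an exact GP.
    """
    Phi = phi(X)
    A = alpha * np.eye(H) + beta * Phi.T @ Phi   # posterior precision
    mean = beta * np.linalg.solve(A, Phi.T @ y)  # posterior mean weights
    return mean, A

def predict(X_new, mean, A, beta=25.0):
    """Predictive mean and variance at new inputs."""
    Phi = phi(X_new)
    mu = Phi @ mean
    var = 1.0 / beta + np.einsum('ij,ij->i', Phi, np.linalg.solve(A, Phi.T).T)
    return mu, var

# Toy black-box objective and a handful of evaluations.
X = rng.uniform(-2, 2, size=(20, 1))
y = np.sin(3 * X[:, 0]) + 0.1 * rng.normal(size=20)
mean, A = blr_posterior(X, y)
mu, var = predict(np.array([[0.5]]), mean, A)
```

In the multi-task setting of the paper, one such linear head would be kept per machine-learning problem while the feature map is shared, so past evaluations on related problems shape the representation used to warm start BO on a new problem.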


Scalable Bayesian Optimization Using Vecchia Approximations of Gaussian Processes

Bayesian optimization is a technique for optimizing black-box target fun...

Neural Process for Black-Box Model Optimization Under Bayesian Framework

There are a large number of optimization problems in physical models whe...

Benchmarking the Neural Linear Model for Regression

The neural linear model is a simple adaptive Bayesian linear regression ...

Harnessing Low-Fidelity Data to Accelerate Bayesian Optimization via Posterior Regularization

Bayesian optimization (BO) is a powerful derivative-free technique for g...

Bayesian Optimization of Combinatorial Structures

The optimization of expensive-to-evaluate black-box functions over combi...

Ensemble Bayesian Optimization

Bayesian Optimization (BO) has been shown to be a very effective paradig...

Gaussian Process Optimization with Adaptive Sketching: Scalable and No Regret

Gaussian processes (GP) are a popular Bayesian approach for the optimiza...