Optimization as Estimation with Gaussian Processes in Bandit Settings

by Zi Wang et al.

Recently, there has been rising interest in Bayesian optimization -- the optimization of an unknown function with assumptions usually expressed by a Gaussian Process (GP) prior. We study an optimization strategy that directly uses an estimate of the argmax of the function. This strategy offers both practical and theoretical advantages: no tradeoff parameter needs to be selected, and, moreover, we establish close connections to the popular GP-UCB and GP-PI strategies. Our approach can be understood as automatically and adaptively trading off exploration and exploitation in GP-UCB and GP-PI. We illustrate the effects of this adaptive tuning via bounds on the regret as well as an extensive empirical evaluation on robotics and vision tasks, demonstrating the robustness of this strategy for a range of performance criteria.
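The strategy described above selects the next query by targeting an estimate of the function's maximum rather than tuning an exploration parameter. As a rough illustration only (not the paper's exact estimator), the sketch below uses a NumPy-only GP with an RBF kernel and a crude plug-in estimate of the max, then picks the grid point most likely to reach that estimate, i.e. the minimizer of (m_hat - mu)/sigma. All names, the kernel hyperparameters, and the choice of m_hat are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(a, b, lengthscale=0.2, variance=1.0):
    """Squared-exponential kernel between two 1-D input arrays."""
    d = a[:, None] - b[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

def gp_posterior(X, y, Xs, noise=1e-6):
    """GP posterior mean and std at test points Xs given data (X, y)."""
    K = rbf_kernel(X, X) + noise * np.eye(len(X))
    Ks = rbf_kernel(X, Xs)
    Kss = rbf_kernel(Xs, Xs)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    mu = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    var = np.clip(np.diag(Kss) - np.sum(v * v, axis=0), 1e-12, None)
    return mu, np.sqrt(var)

def est_acquisition(mu, sigma, m_hat):
    """Pick the point minimizing (m_hat - mu) / sigma: the point most
    likely, under the GP posterior, to attain the estimated maximum."""
    return int(np.argmin((m_hat - mu) / sigma))

# Toy 1-D run. m_hat here is a crude upper estimate of the maximum
# (max posterior mean plus the largest posterior std), chosen only
# to make the sketch self-contained.
f = lambda x: np.sin(3 * x) + 0.5 * x
Xs = np.linspace(0.0, 2.0, 101)      # candidate grid
X = np.array([0.1, 1.0, 1.9])        # points queried so far
y = f(X)
mu, sigma = gp_posterior(X, y, Xs)
m_hat = mu.max() + sigma.max()
idx = est_acquisition(mu, sigma, m_hat)
print(Xs[idx])                       # next point to evaluate
```

Because the acquisition is a probability-of-improvement rule with the target set to an estimate of the max, the exploration level adapts as the estimate tightens, which is the connection to GP-PI and GP-UCB the abstract refers to.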




Related papers:

- Using Distance Correlation for Efficient Bayesian Optimization
- Mixed Strategies for Robust Optimization of Unknown Objectives
- Learning Regions of Interest for Bayesian Optimization with Adaptive Level-Set Estimation
- Time-Varying Gaussian Process Bandit Optimization
- Event-Triggered Time-Varying Bayesian Optimization
- Near-linear Time Gaussian Process Optimization with Adaptive Batching and Resparsification
- A Probabilistic Machine Learning Approach to Scheduling Parallel Loops with Bayesian Optimization
