A Copula approach for hyperparameter transfer learning

09/30/2019
by David Salinas, et al.

Bayesian optimization (BO) is a popular methodology for tuning the hyperparameters of expensive black-box functions. Despite its success, standard BO focuses on a single task at a time and is not designed to leverage information from related functions, such as tuning performance metrics of the same algorithm across multiple datasets. In this work, we introduce a novel approach to achieve transfer learning across different datasets as well as different metrics. The main idea is to regress the mapping from hyperparameters to metric quantiles with a semi-parametric Gaussian Copula distribution, which provides robustness against the different scales or outliers that can occur across tasks. We introduce two methods that leverage this estimation: a Thompson sampling strategy and a Gaussian Copula process that uses the quantile estimate as a prior. We show that these strategies can combine the estimation of multiple metrics, such as runtime and accuracy, steering the optimization toward cheaper hyperparameters for the same level of accuracy. Experiments on an extensive set of hyperparameter tuning tasks demonstrate significant improvements over state-of-the-art methods.
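
To make the copula idea in the abstract concrete, the sketch below is a minimal illustration, not the authors' implementation; the function name, the rank-based empirical CDF, and the clipping constant are assumptions. It maps each task's metric observations through their empirical CDF and then the inverse standard-normal CDF, so that metrics from tasks with very different scales or outliers end up with comparable, Gaussian-like marginals.

```python
import numpy as np
from scipy.stats import norm

def gaussian_copula_transform(y, eps=1e-3):
    """Map one task's raw metric values to standard-normal space.

    Ranks give the empirical CDF; clipping keeps the inverse normal CDF finite.
    (Illustrative sketch; eps and the rank-based CDF are assumptions.)
    """
    y = np.asarray(y, dtype=float)
    n = len(y)
    ranks = np.argsort(np.argsort(y)) + 1           # 1..n, smallest value gets rank 1
    quantiles = np.clip(ranks / n, eps, 1.0 - eps)  # empirical CDF, clipped away from 0 and 1
    return norm.ppf(quantiles)                      # inverse standard-normal CDF

# Example: metrics on very different scales become directly comparable.
accuracy_task_a = [0.61, 0.72, 0.70, 0.95]          # validation accuracy on one dataset
runtime_task_b = [120.0, 45.0, 300.0, 90.0]         # runtime in seconds on another dataset
print(gaussian_copula_transform(accuracy_task_a))
print(gaussian_copula_transform(runtime_task_b))
```

A transfer-learning surrogate, and the Thompson sampling or Gaussian Copula process strategies described in the abstract, would then operate in this transformed space rather than on the raw metric values.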

Related research

09/27/2019  Learning search spaces for Bayesian optimization: Another view of hyperparameter transfer learning
Bayesian optimization (BO) is a successful methodology to optimize black...

02/23/2020  Weighting Is Worth the Wait: Bayesian Optimization with Importance Sampling
Many contemporary machine learning models require extensive tuning of hy...

02/25/2021  Hyperparameter Transfer Learning with Adaptive Complexity
Bayesian optimization (BO) is a sample efficient approach to automatical...

05/26/2022  Towards Learning Universal Hyperparameter Optimizers with Transformers
Meta-learning hyperparameter optimization (HPO) algorithms from prior ex...

06/11/2021  HPO-B: A Large-Scale Reproducible Benchmark for Black-Box HPO based on OpenML
Hyperparameter optimization (HPO) is a core problem for the machine lear...

04/26/2019  A Novel Orthogonal Direction Mesh Adaptive Direct Search Approach for SVM Hyperparameter Tuning
In this paper, we propose the use of a black-box optimization method cal...

12/13/2020  Warm Starting CMA-ES for Hyperparameter Optimization
Hyperparameter optimization (HPO), formulated as black-box optimization ...
