HyperJump: Accelerating HyperBand via Risk Modelling

08/05/2021
by Pedro Mendes, et al.

In the literature on hyper-parameter tuning, a number of recent solutions rely on low-fidelity observations (e.g., training with sub-sampled datasets or for short periods of time) to extrapolate good configurations to use when performing full training. Among these, HyperBand is arguably one of the most popular solutions, due to its efficiency and theoretically provable robustness. In this work, we introduce HyperJump, a new approach that builds on HyperBand's robust search strategy and complements it with novel model-based risk analysis techniques that accelerate the search by jumping (i.e., skipping) the evaluation of low-risk configurations, namely configurations that are likely to be discarded by HyperBand. We evaluate HyperJump on a suite of hyper-parameter optimization problems and show that it provides over one order of magnitude speed-up on a variety of deep-learning and kernel-based learning problems when compared to HyperBand as well as to a number of state-of-the-art optimizers.
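To make the idea concrete, below is a minimal sketch of the successive-halving subroutine at the core of HyperBand: configurations are evaluated at increasing budgets (fidelities), and only the top fraction survives each rung. All function and parameter names here are illustrative assumptions, not the paper's implementation; HyperJump's contribution is a model-based risk analysis that skips rung evaluations whose outcome is predictable, which is not modelled in this toy code.

```python
def successive_halving(configs, evaluate, min_budget=1, max_budget=81, eta=3):
    """One HyperBand-style bracket (illustrative sketch).

    configs    : list of candidate hyper-parameter configurations
    evaluate   : callable (config, budget) -> validation loss (lower is better)
    min_budget : smallest fidelity (e.g., epochs or dataset fraction)
    max_budget : largest fidelity used for the final survivors
    eta        : downsampling rate; keep the top 1/eta configs per rung
    """
    budget = min_budget
    survivors = list(configs)
    while budget <= max_budget and len(survivors) > 1:
        # Evaluate every survivor at the current (low) fidelity.
        scored = sorted(survivors, key=lambda c: evaluate(c, budget))
        # Keep only the best 1/eta fraction, then raise the budget.
        survivors = scored[:max(1, len(survivors) // eta)]
        budget *= eta
    return survivors[0]
```

For instance, with a toy loss of `abs(c - 0.5)` over candidates `[0.1, 0.3, 0.48, 0.9]`, the bracket returns `0.48`, the configuration closest to the optimum. HyperJump would avoid some of these `evaluate` calls entirely when the risk of discarding the eventual winner is estimated to be low.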

Related research

03/21/2016
Hyperband: A Novel Bandit-Based Approach to Hyperparameter Optimization
Performance of machine learning algorithms depends critically on identif...

04/10/2020
A Modified Bayesian Optimization based Hyper-Parameter Tuning Approach for Extreme Gradient Boosting
It is already reported in the literature that the performance of a machi...

10/24/2018
Noisy Blackbox Optimization with Multi-Fidelity Queries: A Tree Search Approach
We study the problem of black-box optimization of a noisy function in th...

03/19/2020
Faster SVM Training via Conjugate SMO
We propose an improved version of the SMO algorithm for training classif...

03/26/2018
Efficient Image Dataset Classification Difficulty Estimation for Predicting Deep-Learning Accuracy
In the deep-learning community new algorithms are published at an incred...

11/09/2020
TrimTuner: Efficient Optimization of Machine Learning Jobs in the Cloud via Sub-Sampling
This work introduces TrimTuner, the first system for optimizing machine ...

02/01/2019
Hyper-parameter Tuning under a Budget Constraint
We study a budgeted hyper-parameter tuning problem, where we optimize th...
