
Iterative Deepening Hyperband

by Jasmin Brandt, et al.

Hyperparameter optimization (HPO) is concerned with the automated search for the most appropriate hyperparameter configuration (HPC) of a parameterized machine learning algorithm. A state-of-the-art HPO method is Hyperband, which, however, has its own parameters that influence its performance. One of these parameters, the maximal budget, is especially problematic: If chosen too small, the budget needs to be increased in hindsight and, as Hyperband is not incremental by design, the entire algorithm must be re-run. This is not only costly but also comes with a loss of valuable knowledge already accumulated. In this paper, we propose incremental variants of Hyperband that eliminate these drawbacks, and show that these variants satisfy theoretical guarantees qualitatively similar to those for the original Hyperband with the "right" budget. Moreover, we demonstrate their practical utility in experiments with benchmark data sets.
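To make the role of the maximal budget concrete, the following is a minimal sketch of the standard Hyperband loop (not the incremental variants proposed in the paper). The `get_config` and `evaluate` callables, the toy objective, and the parameter names `R` and `eta` are illustrative assumptions; the point is that `R` is fixed up front, so enlarging it later invalidates the bracket schedule and forces a full re-run.

```python
import math
import random

def hyperband(get_config, evaluate, R=81, eta=3):
    """Sketch of the standard Hyperband loop.

    get_config() -> a random hyperparameter configuration (assumption)
    evaluate(config, budget) -> validation loss, lower is better (assumption)
    R -- maximal budget per configuration; the parameter that, per the
         paper, must be chosen in advance and cannot be grown incrementally.
    """
    s_max = int(math.log(R, eta))           # number of brackets - 1
    best = (float("inf"), None)             # (loss, config) seen so far
    for s in range(s_max, -1, -1):          # one bracket per value of s
        # more configs at small initial budgets, fewer at large ones
        n = math.ceil((s_max + 1) * eta**s / (s + 1))
        r = R * eta**(-s)                   # initial budget in this bracket
        configs = [get_config() for _ in range(n)]
        for i in range(s + 1):              # successive halving rounds
            budget = r * eta**i
            losses = [evaluate(c, budget) for c in configs]
            ranked = sorted(zip(losses, configs), key=lambda t: t[0])
            best = min(best, ranked[0], key=lambda t: t[0])
            keep = max(1, len(configs) // eta)  # keep top 1/eta configs
            configs = [c for _, c in ranked[:keep]]
    return best
```

For example, with configurations drawn uniformly from [0, 1] and the toy loss `abs(c - 0.5)`, the loop returns a configuration close to 0.5; every schedule quantity (`s_max`, `n`, `r`) is derived from `R`, which is why a too-small `R` cannot be patched after the fact.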



