HYPPO: A Surrogate-Based Multi-Level Parallelism Tool for Hyperparameter Optimization

by Vincent Dumont et al.

We present a new software tool, HYPPO, that enables the automatic tuning of hyperparameters of various deep learning (DL) models. Unlike other hyperparameter optimization (HPO) methods, HYPPO uses adaptive surrogate models and directly accounts for uncertainty in model predictions to find accurate and reliable models that make robust predictions. Using asynchronous nested parallelism, we are able to significantly alleviate the computational burden of training complex architectures and quantifying the uncertainty. HYPPO is implemented in Python and can be used with both the TensorFlow and PyTorch libraries. We demonstrate various software features on time-series prediction and image classification problems, as well as a scientific application in computed tomography image reconstruction. Finally, we show that (1) we can reduce by an order of magnitude the number of evaluations necessary to find the optimal region in the hyperparameter space and (2) we can reduce by two orders of magnitude the time needed for such an HPO process to complete.
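To illustrate the core idea behind surrogate-based HPO (this is a generic sketch, not HYPPO's actual API): a cheap surrogate is fit to the hyperparameter/loss pairs observed so far, and the next candidate is chosen where the surrogate predicts the lowest loss, so only a few expensive model trainings are needed. The `expensive_loss` function and the quadratic surrogate below are illustrative assumptions.

```python
# Minimal sketch of surrogate-based hyperparameter optimization.
# NOT HYPPO's API: `expensive_loss` is a hypothetical stand-in for
# training a DL model, and the surrogate is a simple quadratic fit.
import numpy as np

rng = np.random.default_rng(0)

def expensive_loss(lr):
    # Stand-in for an expensive training run at learning rate `lr`;
    # the true optimum here is at log10(lr) = -2.
    return (np.log10(lr) + 2.0) ** 2 + 0.05 * rng.standard_normal()

# A few initial random evaluations in log10(learning-rate) space.
X = rng.uniform(-4.0, 0.0, size=5)
y = np.array([expensive_loss(10.0 ** x) for x in X])

for _ in range(10):
    # Surrogate: least-squares quadratic fit to the observations so far.
    coeffs = np.polyfit(X, y, deg=2)
    candidates = np.linspace(-4.0, 0.0, 401)
    preds = np.polyval(coeffs, candidates)
    # Evaluate the expensive loss only where the surrogate looks best.
    x_next = candidates[np.argmin(preds)]
    X = np.append(X, x_next)
    y = np.append(y, expensive_loss(10.0 ** x_next))

best = X[np.argmin(y)]
print(f"best log10(lr) found: {best:.2f}")
```

Each loop iteration costs one expensive evaluation instead of a dense grid scan, which is the source of the evaluation-count savings the abstract describes; HYPPO additionally adapts the surrogate and runs evaluations asynchronously in parallel.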
