Efficient Hyperparameter Optimization of Deep Learning Algorithms Using Deterministic RBF Surrogates

07/28/2016
by Ilija Ilievski, et al.

Automatically searching for optimal hyperparameter configurations is of crucial importance for applying deep learning algorithms in practice. Recently, Bayesian optimization has been proposed for optimizing hyperparameters of various machine learning algorithms. These methods adopt probabilistic surrogate models, such as Gaussian processes, to approximate and minimize the validation error as a function of hyperparameter values. However, probabilistic surrogates require accurate estimates of sufficient statistics (e.g., covariance) of the error distribution and thus need many function evaluations when the number of hyperparameters is sizeable. This makes them inefficient for optimizing the hyperparameters of deep learning algorithms, which are highly expensive to evaluate. In this work, we propose a new deterministic and efficient hyperparameter optimization method that employs radial basis functions as error surrogates. The proposed mixed-integer algorithm, called HORD, searches the surrogate for the most promising hyperparameter values through dynamic coordinate search and requires many fewer function evaluations. HORD performs well in low dimensions and is markedly stronger in higher dimensions. Extensive evaluations on MNIST and CIFAR-10 for four deep neural networks demonstrate that HORD significantly outperforms well-established Bayesian optimization methods such as GP, SMAC, and TPE. For instance, on average, HORD is more than 6 times faster than GP-EI in obtaining the best configuration of 19 hyperparameters.
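To make the idea concrete, the loop below is a minimal numpy sketch of RBF-surrogate-based search in the spirit of the abstract: fit a deterministic radial basis function interpolant to the configurations evaluated so far, generate candidates by perturbing a random subset of the incumbent's coordinates (a dynamic-coordinate-search-style step), and pick the candidate that trades off low predicted error against distance from already-evaluated points. This is not the authors' HORD implementation; the cubic RBF kernel, the perturbation scale, the candidate count, and the 0.7/0.3 score weights are illustrative assumptions.

```python
import numpy as np

def fit_cubic_rbf(X, y):
    """Fit a cubic RBF interpolant phi(r) = r^3 with a linear polynomial tail."""
    n, d = X.shape
    r = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    P = np.hstack([np.ones((n, 1)), X])
    A = np.block([[r ** 3, P], [P.T, np.zeros((d + 1, d + 1))]])
    coef = np.linalg.solve(A, np.concatenate([y, np.zeros(d + 1)]))
    return coef[:n], coef[n:]          # RBF weights, polynomial tail

def predict_rbf(X, lam, tail, Xq):
    """Evaluate the fitted surrogate at query points Xq."""
    r = np.linalg.norm(Xq[:, None, :] - X[None, :, :], axis=2)
    return (r ** 3) @ lam + np.hstack([np.ones((len(Xq), 1)), Xq]) @ tail

def rbf_search(f, dim, n_init=6, n_iter=20, n_cand=200, seed=0):
    """Minimize a validation-error function f over [0, 1]^dim (toy sketch)."""
    rng = np.random.default_rng(seed)
    X = rng.uniform(0, 1, size=(n_init, dim))     # initial design points
    y = np.array([f(x) for x in X])
    for _ in range(n_iter):
        lam, tail = fit_cubic_rbf(X, y)
        best = X[np.argmin(y)]
        # Perturb a random subset of the incumbent's coordinates.
        mask = rng.random((n_cand, dim)) < 0.5
        none = ~mask.any(axis=1)                  # force >= 1 perturbed coordinate
        mask[none, rng.integers(0, dim, size=none.sum())] = True
        cand = np.clip(best + mask * rng.normal(0, 0.2, (n_cand, dim)), 0, 1)
        # Score: low predicted error, but not too close to evaluated points.
        s = predict_rbf(X, lam, tail, cand)
        dmin = np.min(np.linalg.norm(cand[:, None, :] - X[None, :, :], axis=2), axis=1)
        score = 0.7 * (s - s.min()) / (np.ptp(s) + 1e-12) \
              + 0.3 * (dmin.max() - dmin) / (np.ptp(dmin) + 1e-12)
        x_new = cand[np.argmin(score)]
        X = np.vstack([X, x_new])
        y = np.append(y, f(x_new))                # one expensive evaluation per iter
    i = np.argmin(y)
    return X[i], y[i]
```

On a toy quadratic "validation error" such as `f(x) = sum((x - 0.3)**2)`, the search concentrates evaluations near the optimum after a handful of iterations. The key contrast with Gaussian-process methods is that the surrogate here is a deterministic interpolant: no covariance hyperparameters are estimated, so each refit is a single linear solve.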


Related research

03/21/2020 · Towards Automatic Bayesian Optimization: A first step involving acquisition functions
Bayesian Optimization is the state of the art technique for the optimiza...

01/02/2019 · Multi-level CNN for lung nodule classification with Gaussian Process assisted hyperparameter optimization
This paper investigates lung nodule classification by using deep neural ...

02/20/2022 · Dynamic and Efficient Gray-Box Hyperparameter Optimization for Deep Learning
Gray-box hyperparameter optimization techniques have recently emerged as...

11/06/2018 · Fast Hyperparameter Optimization of Deep Neural Networks via Ensembling Multiple Surrogates
The performance of deep neural networks crucially depends on good hyperp...

01/19/2021 · Few-Shot Bayesian Optimization with Deep Kernel Surrogates
Hyperparameter optimization (HPO) is a central pillar in the automation ...

01/17/2021 · Cost-Efficient Online Hyperparameter Optimization
Recent work on hyperparameters optimization (HPO) has shown the possibil...

05/05/2023 · Optimizing Hyperparameters with Conformal Quantile Regression
Many state-of-the-art hyperparameter optimization (HPO) algorithms rely ...
