Cost Effective Optimization for Cost-related Hyperparameters

by Qingyun Wu, et al.

The increasing demand for democratizing machine learning among general software developers calls for hyperparameter optimization (HPO) solutions that operate at low cost. Many machine learning algorithms have hyperparameters that can cause large variation in training cost, yet this effect is largely ignored by existing HPO methods, which are incapable of properly controlling cost during the optimization process. To address this problem, we develop a cost-effective HPO solution. The core of our solution is a new randomized direct-search method. We prove a convergence rate of O(√d/√K) and analyze how the method can be used to control evaluation cost under reasonable assumptions. Extensive evaluation on a recent AutoML benchmark shows strong anytime performance of the proposed HPO method when tuning cost-related hyperparameters.
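To make the idea concrete, here is a minimal sketch of a generic randomized direct-search loop of the kind the abstract describes: at each step it probes a random unit direction (and its opposite), moves if the objective improves, and otherwise shrinks the step size. This is an illustrative toy, not the paper's exact algorithm or its cost-control mechanism; the function and parameter names are my own.

```python
import numpy as np

def randomized_direct_search(f, x0, step=1.0, n_iters=200, seed=0):
    """Minimize f by randomized direct search.

    Each iteration samples a uniformly random unit direction u, probes
    x + step*u and x - step*u, accepts the first improving probe, and
    shrinks the step size when neither probe improves. A generic sketch
    only; the paper's method differs in detail.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    fx = f(x)
    for _ in range(n_iters):
        u = rng.normal(size=x.shape)
        u /= np.linalg.norm(u)            # uniform random unit direction
        for cand in (x + step * u, x - step * u):
            fc = f(cand)
            if fc < fx:                   # accept the first improving probe
                x, fx = cand, fc
                break
        else:
            step *= 0.9                   # no improvement: shrink the step
    return x, fx

# Usage: minimize a toy quadratic standing in for a validation-loss surface
# over two (hypothetical) cost-related hyperparameters.
x_best, f_best = randomized_direct_search(
    lambda x: float(np.sum((x - 3.0) ** 2)), x0=[0.0, 0.0])
```

Because low-cost probes are tried before committing to a move, such a loop only pays for evaluations along one random direction per iteration, which is what makes the O(√d/√K) rate in the abstract plausible to control against a cost budget.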


