ExperienceThinking: Hyperparameter Optimization with Budget Constraints

12/02/2019
by Chunnan Wang, et al.

The problem of hyperparameter optimization arises widely in practice, and many common tasks, such as neural architecture search and feature subset selection, can be reduced to it. Without resource constraints, existing hyperparameter tuning techniques can solve these problems effectively by evaluating as many hyperparameter configurations as possible. In practice, however, limited resources and budget make it infeasible to evaluate so many configurations, which calls for effective algorithms that find the best possible hyperparameter configuration within a finite number of configuration evaluations. In this paper, we simulate human thinking processes and combine the merits of existing techniques, proposing a new algorithm called ExperienceThinking to solve this constrained hyperparameter optimization problem. In addition, we analyze the performance of 3 classical hyperparameter optimization algorithms under a finite number of configuration evaluations and compare it with that of ExperienceThinking. The experimental results show that our proposed algorithm provides superior results and achieves better performance.
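To make the constrained setting concrete, the sketch below shows the simplest baseline for budget-constrained HPO: random search that is only allowed `budget` configuration evaluations and keeps the best result seen. This is an illustration of the problem formulation, not the paper's ExperienceThinking algorithm; the search space and objective are hypothetical.

```python
import random

def budget_constrained_search(evaluate, space, budget, seed=0):
    """Random search under a fixed evaluation budget: sample at most
    `budget` configurations from `space` and return the best one seen."""
    rng = random.Random(seed)
    best_cfg, best_score = None, float("-inf")
    for _ in range(budget):
        # Draw one value per hyperparameter to form a configuration.
        cfg = {name: rng.choice(values) for name, values in space.items()}
        score = evaluate(cfg)  # each call consumes one unit of budget
        if score > best_score:
            best_cfg, best_score = cfg, score
    return best_cfg, best_score

# Toy objective (hypothetical): score peaks at lr=0.1, depth=3.
def evaluate(cfg):
    return -abs(cfg["lr"] - 0.1) - 0.05 * abs(cfg["depth"] - 3)

space = {"lr": [0.001, 0.01, 0.1, 1.0], "depth": [1, 2, 3, 4, 5]}
best_cfg, best_score = budget_constrained_search(evaluate, space, budget=10)
```

Smarter methods (Bayesian optimization, Hyperband, or the approach proposed here) differ from this baseline only in *which* configurations they spend the same finite budget on.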


