
In a Nutshell: Sequential Parameter Optimization

by Thomas Bartz-Beielstein et al.

The performance of optimization algorithms relies crucially on their parameterizations. Finding good parameter settings is called algorithm tuning. Using a simple simulated annealing algorithm, we will demonstrate how optimization algorithms can be tuned using the sequential parameter optimization toolbox (SPOT). SPOT provides several tools for automated and interactive tuning. The underlying concepts of the SPOT approach are explained. This includes key techniques such as exploratory fitness landscape analysis and response surface methodology. Many examples illustrate how SPOT can be used for understanding the performance of algorithms and gaining insight into an algorithm's behavior. Furthermore, we demonstrate how SPOT can be used as an optimizer and how a sophisticated ensemble approach is able to combine several metamodels via stacking.
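The core idea described above, sequentially evaluating and refining parameter settings of an algorithm such as simulated annealing, can be illustrated with a small self-contained sketch. This is not the SPOT API; the function names, the quadratic test objective, the `(temp0, cooling)` parameter pair, and the local-perturbation proposal step (a crude stand-in for SPOT's surrogate-model-based proposals) are all illustrative assumptions.

```python
import math
import random


def simulated_annealing(temp0, cooling, seed=1, steps=200):
    """Minimize f(x) = x^2 with simulated annealing.

    Returns the best objective value found; temp0 and cooling are the
    tunable algorithm parameters.
    """
    rng = random.Random(seed)
    x = rng.uniform(-5.0, 5.0)
    fx = best = x * x
    t = temp0
    for _ in range(steps):
        cand = x + rng.gauss(0.0, 1.0)
        fc = cand * cand
        # accept improvements always, worse moves with Boltzmann probability
        if fc < fx or rng.random() < math.exp(-(fc - fx) / max(t, 1e-12)):
            x, fx = cand, fc
            best = min(best, fx)
        t *= cooling  # geometric cooling schedule
    return best


def sequential_tuning(budget=10):
    """Toy sequential tuner: evaluate an initial design of parameter
    settings, then repeatedly propose new settings near the incumbent
    (a stand-in for fitting and optimizing a surrogate metamodel)."""
    rng = random.Random(42)
    design = [(rng.uniform(0.1, 10.0), rng.uniform(0.8, 0.999))
              for _ in range(4)]
    evals = [(simulated_annealing(t0, c), (t0, c)) for t0, c in design]
    for _ in range(budget - len(design)):
        _, (bt0, bc) = min(evals)  # incumbent: best setting so far
        t0 = max(1e-3, bt0 + rng.gauss(0.0, bt0 * 0.3))
        c = min(0.999, max(0.5, bc + rng.gauss(0.0, 0.02)))
        evals.append((simulated_annealing(t0, c), (t0, c)))
    return min(evals)


best_perf, best_params = sequential_tuning()
```

In SPOT itself the proposal step is driven by a fitted metamodel (e.g. a Kriging or regression model) rather than the random local perturbation used here, but the overall loop, an initial design followed by model-guided sequential evaluations, is the same.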
