Surrogate Model Based Hyperparameter Tuning for Deep Learning with SPOT

05/30/2021
by Thomas Bartz-Beielstein, et al.

A surrogate model based hyperparameter tuning approach for deep learning is presented. This article demonstrates how the architecture-level parameters (hyperparameters) of deep learning models implemented in Keras/TensorFlow can be optimized. The tuning procedure requires only about 100 lines of code: existing R packages (tfruns and SPOT) are combined to perform the hyperparameter tuning. An elementary hyperparameter tuning task (a neural network on the MNIST data) is used to exemplify this approach.
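The core loop of surrogate model based tuning can be illustrated in a few lines. The sketch below is not the SPOT/tfruns implementation described in the article; it is a minimal, self-contained Python illustration of the general idea, with a toy quadratic standing in for the expensive objective (e.g., training a Keras model) and a fitted polynomial standing in for the surrogate:

```python
import numpy as np

def expensive_objective(x):
    # Toy stand-in for an expensive evaluation such as training a network
    # and returning its validation loss. Minimum is at x = 2.
    return (x - 2.0) ** 2

lower, upper = 0.0, 5.0
X = np.array([lower, (lower + upper) / 2.0, upper])  # small initial design
y = expensive_objective(X)

for _ in range(5):
    # Fit a cheap surrogate (quadratic polynomial) to all evaluations so far
    coeffs = np.polyfit(X, y, deg=2)
    # Optimize the surrogate instead of the expensive objective
    grid = np.linspace(lower, upper, 501)
    x_next = grid[np.argmin(np.polyval(coeffs, grid))]
    # Evaluate the expensive objective only at the surrogate's proposal
    X = np.append(X, x_next)
    y = np.append(y, expensive_objective(x_next))

best = float(X[np.argmin(y)])
print(best)
```

SPOT follows the same evaluate/fit/propose cycle, but with more sophisticated surrogate models (e.g., Kriging) and infill criteria; tfruns supplies the expensive-objective side by launching and logging Keras training runs.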

Related research

- 07/17/2023: Hyperparameter Tuning Cookbook: A guide for scikit-learn, PyTorch, river, and spotPython. "This document provides a comprehensive guide to hyperparameter tuning us..."
- 05/18/2022: Hyperparameter Optimization with Neural Network Pruning. "Since the deep learning model is highly dependent on hyperparameters, hy..."
- 10/04/2021: HYPPO: A Surrogate-Based Multi-Level Parallelism Tool for Hyperparameter Optimization. "We present a new software, HYPPO, that enables the automatic tuning of h..."
- 10/22/2018: LAMVI-2: A Visual Tool for Comparing and Tuning Word Embedding Models. "Tuning machine learning models, particularly deep learning architectures..."
- 10/19/2020: How much progress have we made in neural network training? A New Evaluation Protocol for Benchmarking Optimizers. "Many optimizers have been proposed for training deep neural networks, an..."
- 05/24/2021: Guided Hyperparameter Tuning Through Visualization and Inference. "For deep learning practitioners, hyperparameter tuning for optimizing mo..."
- 08/29/2021: CrossedWires: A Dataset of Syntactically Equivalent but Semantically Disparate Deep Learning Models. "The training of neural networks using different deep learning frameworks..."