NAS-Bench-301 and the Case for Surrogate Benchmarks for Neural Architecture Search

08/22/2020
by Julien Siems, et al.

Neural Architecture Search (NAS) is a logical next step in the automatic learning of representations, but the development of NAS methods is slowed by high computational demands. As a remedy, several tabular NAS benchmarks have been proposed to simulate runs of NAS methods in seconds. However, all existing tabular NAS benchmarks are limited to extremely small architectural spaces, since they rely on exhaustive evaluation of the space. This leads to unrealistic results, such as strong performance of local search and random search, which do not transfer to larger search spaces. To overcome this fundamental limitation, we propose NAS-Bench-301, the first model-based surrogate NAS benchmark, built on a search space containing 10^18 architectures, orders of magnitude larger than any previous NAS benchmark. We first motivate the benefits of a surrogate benchmark over a tabular one: the surrogate smooths out the noise stemming from the stochasticity of the single SGD runs recorded in a tabular benchmark. We then analyze our new dataset of architecture evaluations and comprehensively evaluate various regression models as surrogates, demonstrating their capability to model the architecture space, and additionally use deep ensembles to model uncertainty. Finally, we benchmark a wide range of NAS algorithms on NAS-Bench-301, obtaining results comparable to the true benchmark at a fraction of the cost.
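
The core idea of a model-based surrogate benchmark can be illustrated with a short sketch. The code below is illustrative only and does not use the actual NAS-Bench-301 API or its surrogate models: it fits a small ensemble of regressors on hypothetical (architecture encoding, validation accuracy) pairs and uses the spread of ensemble predictions as an uncertainty estimate, mirroring the deep-ensemble idea mentioned in the abstract. All names, encodings, and data here are made up for illustration.

```python
# Minimal sketch of the surrogate-benchmark idea (NOT the NAS-Bench-301 API).
# An ensemble of regressors is fit on a dataset of (architecture encoding,
# validation accuracy) pairs; a NAS method then queries the ensemble instead
# of training each candidate architecture from scratch.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Hypothetical training data: each architecture is encoded as a fixed-length
# feature vector (e.g., one-hot operation choices on the edges of a cell).
X_train = rng.random((500, 64))           # 500 evaluated architectures
y_train = 90.0 + 10.0 * rng.random(500)   # their validation accuracies (%)

# Deep-ensemble-style uncertainty: train several models with different seeds.
ensemble = []
for seed in range(5):
    model = MLPRegressor(hidden_layer_sizes=(128, 128),
                         max_iter=500, random_state=seed)
    model.fit(X_train, y_train)
    ensemble.append(model)

def query_surrogate(arch_encoding):
    """Return predicted accuracy and an ensemble-based uncertainty estimate."""
    preds = np.array([m.predict(arch_encoding[None, :])[0] for m in ensemble])
    return preds.mean(), preds.std()

# A NAS method would call this in place of a full training run:
mean_acc, std_acc = query_surrogate(rng.random(64))
print(f"predicted accuracy: {mean_acc:.2f}% +/- {std_acc:.2f}")
```

Querying such a surrogate takes milliseconds, which is what allows simulated NAS runs over a space of 10^18 architectures at a fraction of the true training cost.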

Related research

10/19/2020 · Neural Architecture Performance Prediction Using Graph Neural Networks
In computer vision research, the process of automating architecture engi...

07/04/2021 · Mutation is all you need
Neural architecture search (NAS) promises to make deep learning accessib...

01/30/2022 · Augmenting Novelty Search with a Surrogate Model to Engineer Meta-Diversity in Ensembles of Classifiers
Using Neuroevolution combined with Novelty Search to promote behavioural...

04/20/2020 · Local Search is a Remarkably Strong Baseline for Neural Architecture Search
Neural Architecture Search (NAS), i.e., the automation of neural network...

02/04/2022 · Heed the Noise in Performance Evaluations in Neural Architecture Search
Neural Architecture Search (NAS) has recently become a topic of great in...

06/12/2023 · Rethink DARTS Search Space and Renovate a New Benchmark
DARTS search space (DSS) has become a canonical benchmark for NAS wherea...

08/20/2021 · Lessons from the Clustering Analysis of a Search Space: A Centroid-based Approach to Initializing NAS
Lots of effort in neural architecture search (NAS) research has been ded...
