NAS-Bench-Suite-Zero: Accelerating Research on Zero Cost Proxies

10/06/2022
by Arjun Krishnakumar, et al.

Zero-cost proxies (ZC proxies) are a recent architecture performance prediction technique aiming to significantly speed up algorithms for neural architecture search (NAS). Recent work has shown great promise for these techniques, but certain aspects, such as evaluating and exploiting their complementary strengths, remain under-studied. In this work, we create NAS-Bench-Suite-Zero: we evaluate 13 ZC proxies across 28 tasks, creating by far the largest dataset (and unified codebase) for ZC proxies, enabling orders-of-magnitude faster experiments on ZC proxies while avoiding confounding factors stemming from different implementations. To demonstrate the usefulness of NAS-Bench-Suite-Zero, we run a large-scale analysis of ZC proxies, including a bias analysis and the first information-theoretic analysis, which concludes that ZC proxies capture substantial complementary information. Motivated by these findings, we present a procedure to improve the performance of ZC proxies by reducing biases such as cell size, and we also show that incorporating all 13 ZC proxies into the surrogate models used by NAS algorithms can improve their predictive performance by up to 42%. Our code and datasets are available at https://github.com/automl/naslib/tree/zerocost.
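The idea of feeding all 13 ZC proxy scores into a surrogate model can be illustrated with a minimal sketch. The data below is synthetic (each proxy is modeled as a noisy view of an architecture's true accuracy, mirroring the finding that proxies carry complementary information), and the random-forest surrogate stands in for the predictors used by NAS algorithms; this is not the paper's actual pipeline or the NASLib API.

```python
import numpy as np
from scipy.stats import spearmanr
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Hypothetical setup: 200 architectures, each scored by 13 ZC proxies.
# Each proxy is a noisy, partially independent view of the (unknown)
# true accuracy, so the proxies are individually weak but complementary.
n_archs, n_proxies = 200, 13
true_acc = rng.uniform(0.6, 0.95, size=n_archs)
proxy_scores = true_acc[:, None] + rng.normal(0, 0.15, size=(n_archs, n_proxies))

# Surrogate model that takes all 13 proxy scores as input features.
train, test = np.arange(150), np.arange(150, 200)
model = RandomForestRegressor(n_estimators=100, random_state=0)
model.fit(proxy_scores[train], true_acc[train])
pred = model.predict(proxy_scores[test])

# Compare rank correlation of the combined surrogate against the
# single best proxy on the held-out architectures.
rho_combined = spearmanr(pred, true_acc[test]).correlation
rho_single = max(
    spearmanr(proxy_scores[test, i], true_acc[test]).correlation
    for i in range(n_proxies)
)
print(f"combined surrogate: {rho_combined:.2f}, best single proxy: {rho_single:.2f}")
```

Because the proxies' errors are independent here, the combined surrogate recovers a much cleaner ranking than any single proxy, which is the qualitative effect the abstract reports for real ZC proxies.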


Related research

- 01/31/2022 — NAS-Bench-Suite: NAS Evaluation is (Now) Surprisingly Easy. "The release of tabular benchmarks, such as NAS-Bench-101 and NAS-Bench-2..."
- 01/20/2021 — Zero-Cost Proxies for Lightweight NAS. "Neural Architecture Search (NAS) is quickly becoming the standard method..."
- 11/05/2021 — NAS-Bench-x11 and the Power of Learning Curves. "While early research in neural architecture search (NAS) required extrem..."
- 10/12/2021 — NAS-Bench-360: Benchmarking Diverse Tasks for Neural Architecture Search. "Most existing neural architecture search (NAS) benchmarks and algorithms..."
- 08/26/2021 — Understanding and Accelerating Neural Architecture Search with Training-Free and Theory-Grounded Metrics. "This work targets designing a principled and unified training-free frame..."
- 11/02/2022 — Speeding up NAS with Adaptive Subset Selection. "A majority of recent developments in neural architecture search (NAS) ha..."
- 07/20/2020 — NSGANetV2: Evolutionary Multi-Objective Surrogate-Assisted Neural Architecture Search. "In this paper, we propose an efficient NAS algorithm for generating task..."
