Rapid Neural Architecture Search by Learning to Generate Graphs from Datasets

07/02/2021
by   Hayeon Lee, et al.

Despite the success of recent Neural Architecture Search (NAS) methods, which have been shown to output networks that largely outperform human-designed ones, conventional NAS methods have mostly tackled the optimization of a network architecture for a single task (dataset), which does not generalize well across multiple tasks (datasets). Moreover, since such task-specific methods search for a neural architecture from scratch for every given task, they incur a large computational cost, which is problematic when the time and monetary budget are limited. In this paper, we propose an efficient NAS framework that is trained once on a database consisting of datasets and pretrained networks and can rapidly search for a neural architecture for a novel dataset. The proposed MetaD2A (Meta Dataset-to-Architecture) model can stochastically generate graphs (architectures) from a given set (dataset) via a cross-modal latent space learned with amortized meta-learning. Moreover, we propose a meta-performance predictor to estimate and select the best architecture without direct training on target datasets. The experimental results demonstrate that our model, meta-learned on subsets of ImageNet-1K and architectures from the NAS-Bench-201 search space, successfully generalizes to multiple unseen datasets including CIFAR-10 and CIFAR-100, with an average search time of 33 GPU seconds. Even in the MobileNetV3 search space, MetaD2A is 5.5K times faster than NSGANetV2, a transferable NAS method, with comparable performance. We believe that MetaD2A opens a new research direction for rapid NAS, as well as ways to utilize the knowledge from the rich databases of datasets and architectures accumulated over the past years. Code is available at https://github.com/HayeonLee/MetaD2A.
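At a high level, the abstract describes a three-stage pipeline: a set encoder summarizes the dataset, a stochastic graph decoder samples candidate architectures conditioned on that summary, and a meta-performance predictor ranks the candidates without training them. The following stdlib-only Python sketch shows that control flow under loose assumptions; the function names, the toy decoder, and the heuristic scoring are illustrative placeholders, not the authors' implementation.

```python
import random

# NAS-Bench-201-style candidate operations (illustrative subset)
OPS = ["none", "skip", "conv1x1", "conv3x3", "avgpool"]

def encode_dataset(samples):
    """Set encoder: an order-invariant summary (here, the feature-wise mean)."""
    dim = len(samples[0])
    return [sum(s[i] for s in samples) / len(samples) for i in range(dim)]

def decode_architecture(z, rng, n_edges=6):
    """Stochastic graph decoder: latent summary + noise -> one op per cell edge.
    A toy stand-in: real decoding would condition the distribution on z."""
    return [rng.choice(OPS) for _ in range(n_edges)]

def predict_performance(z, arch):
    """Meta-predictor: scores a (dataset, architecture) pair without training.
    A toy heuristic that favors convolutional edges."""
    weights = {"conv3x3": 3, "conv1x1": 2, "skip": 1}
    return sum(weights.get(op, 0) for op in arch)

def meta_nas(samples, n_candidates=8, seed=0):
    """Encode once, sample candidates, return the top-ranked architecture."""
    rng = random.Random(seed)
    z = encode_dataset(samples)
    candidates = [decode_architecture(z, rng) for _ in range(n_candidates)]
    return max(candidates, key=lambda a: predict_performance(z, a))

arch = meta_nas([[0.1, 0.2], [0.3, 0.4]])
```

Because search amounts to amortized inference (a forward pass plus candidate scoring) rather than training, the cost per new dataset stays in the range of seconds rather than GPU-days.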


Related research

- 05/26/2023, Meta-prediction Model for Distillation-Aware NAS on Unseen Datasets: Distillation-aware Neural Architecture Search (DaNAS) aims to search for...
- 03/03/2020, BATS: Binary ArchitecTure Search: This paper proposes Binary ArchitecTure Search (BATS), a framework that ...
- 03/02/2021, Task-Adaptive Neural Network Retrieval with Meta-Contrastive Learning: Most conventional Neural Architecture Search (NAS) approaches are limite...
- 01/31/2023, NASiam: Efficient Representation Learning using Neural Architecture Search for Siamese Networks: Siamese networks are one of the most trending methods to achieve self-su...
- 11/09/2019, Learning to reinforcement learn for Neural Architecture Search: Reinforcement learning (RL) is a goal-oriented learning solution that ha...
- 05/12/2022, Warm-starting DARTS using meta-learning: Neural architecture search (NAS) has shown great promise in the field of...
- 05/26/2023, DiffusionNAG: Task-guided Neural Architecture Generation with Diffusion Models: Neural Architecture Search (NAS) has emerged as a powerful technique for...
