BayesNAS: A Bayesian Approach for Neural Architecture Search

05/13/2019
by   Hongpeng Zhou, et al.

One-Shot Neural Architecture Search (NAS) is a promising method to significantly reduce search time without any separate training. It can be treated as a network compression problem on the architecture parameters of an over-parameterized network. However, most one-shot NAS methods suffer from two issues. First, dependencies between a node and its predecessors and successors are often disregarded, which results in improper treatment of zero operations. Second, pruning architecture parameters based on their magnitude is questionable. In this paper, we employ the classic Bayesian learning approach to alleviate these two issues by modeling architecture parameters using hierarchical automatic relevance determination (HARD) priors. Unlike other NAS methods, we train the over-parameterized network for only one epoch and then update the architecture. Impressively, this enabled us to find the architecture on CIFAR-10, in both proxy and proxyless tasks, within only 0.2 GPU days using a single GPU. As a byproduct, our approach can be transferred directly to compress convolutional neural networks by enforcing structural sparsity, achieving extremely sparse networks without accuracy deterioration.
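To see how an automatic relevance determination (ARD) prior prunes parameters by evidence rather than by magnitude, here is a minimal, self-contained sketch for a Bayesian linear model (not the paper's hierarchical NAS formulation). Each weight gets its own Gaussian prior precision; MacKay's fixed-point updates drive the precision of irrelevant weights to infinity, at which point they are pruned. The function name, thresholds, and iteration count are illustrative assumptions.

```python
import numpy as np

def ard_prune(Phi, t, n_iter=200, alpha_init=1.0, beta=100.0, prune_at=1e6):
    """Sparse Bayesian linear regression with an ARD prior (illustrative).

    Each weight w_i has prior N(0, 1/alpha_i). Evidence maximization
    drives alpha_i -> infinity for irrelevant weights, which are then
    pruned -- the decision uses posterior uncertainty, not just |w_i|.
    beta is the assumed noise precision of the observations t.
    """
    n, d = Phi.shape
    alpha = np.full(d, alpha_init)
    keep = np.arange(d)                      # indices of surviving weights
    for _ in range(n_iter):
        # Posterior over the surviving weights given current precisions
        Sigma = np.linalg.inv(np.diag(alpha) + beta * Phi.T @ Phi)
        mu = beta * Sigma @ Phi.T @ t
        # MacKay fixed-point update: gamma_i in [0, 1] measures how well
        # weight i is determined by the data
        gamma = 1.0 - alpha * np.diag(Sigma)
        alpha = gamma / (mu ** 2 + 1e-12)
        # Prune weights whose prior precision has effectively diverged
        mask = alpha < prune_at
        if not mask.all():
            Phi, alpha, keep = Phi[:, mask], alpha[mask], keep[mask]
    # Final posterior mean over the surviving weights
    Sigma = np.linalg.inv(np.diag(alpha) + beta * Phi.T @ Phi)
    mu = beta * Sigma @ Phi.T @ t
    return keep, mu

# Usage: 8 candidate features, only indices 1 and 4 are truly relevant.
rng = np.random.default_rng(0)
Phi = rng.normal(size=(100, 8))
w_true = np.zeros(8); w_true[1], w_true[4] = 2.0, -3.0
t = Phi @ w_true + 0.01 * rng.normal(size=100)
keep, mu = ard_prune(Phi, t)
```

Magnitude pruning would rank weights by |w_i| alone; the ARD update instead compares each weight's posterior mean against its posterior uncertainty, so a small but well-determined weight survives while a poorly-determined one does not.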


