Depth-Wise Neural Architecture Search

by Artur Jordao, et al.

Modern convolutional networks such as ResNet and NASNet have achieved state-of-the-art results in many computer vision applications. These architectures consist of stages, which are sets of layers that operate on representations at the same resolution. It has been demonstrated that increasing the number of layers in each stage improves the prediction ability of the network. However, the resulting architecture becomes computationally expensive in terms of floating-point operations, memory requirements and inference time. Thus, significant human effort is necessary to evaluate different trade-offs between depth and performance. To handle this problem, recent works have proposed to automatically design high-performance architectures, mainly by means of neural architecture search (NAS). Current NAS strategies analyze a large set of possible candidate architectures and, hence, require vast computational resources and take many GPU days. Motivated by this, we propose a NAS approach to efficiently design accurate and low-cost convolutional architectures, and we demonstrate that an efficient strategy for designing these architectures is to learn the depth stage-by-stage. For this purpose, our approach increases the depth of each stage incrementally, taking into account its importance, such that stages with low importance are kept shallow while stages with high importance become deeper. We conduct experiments on the CIFAR and different versions of ImageNet datasets, where we show that architectures discovered by our approach achieve better accuracy and efficiency than human-designed architectures. Additionally, we show that architectures discovered on CIFAR-10 can be successfully transferred to large datasets. Compared to previous NAS approaches, our method is substantially more efficient, as it evaluates one order of magnitude fewer models and yields architectures on par with the state-of-the-art.
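The stage-wise idea above can be illustrated with a small sketch: given per-stage importance scores, allocate a layer budget so that important stages grow deeper while unimportant ones stay shallow. This is a hypothetical illustration of the allocation principle, not the paper's actual search procedure; the importance scores and the greedy allocation rule are assumptions.

```python
def allocate_depth(importances, total_layers, min_depth=1):
    """Distribute a layer budget across stages in proportion to importance.

    importances  -- hypothetical per-stage importance scores (positive floats)
    total_layers -- total number of layers to distribute across all stages
    min_depth    -- every stage keeps at least this many layers

    Stages with low importance stay near min_depth, while stages with
    high importance receive more of the remaining budget.
    """
    depths = [min_depth] * len(importances)
    remaining = total_layers - min_depth * len(importances)
    total_imp = sum(importances)
    # Greedily add one layer at a time to the stage that is currently
    # furthest below its importance-proportional share of the budget.
    for _ in range(remaining):
        deficits = [imp / total_imp * total_layers - d
                    for imp, d in zip(importances, depths)]
        depths[deficits.index(max(deficits))] += 1
    return depths


# Example: three stages with decreasing importance and a 10-layer budget.
print(allocate_depth([0.5, 0.3, 0.2], 10))  # -> [5, 3, 2]
```

In an actual NAS loop, each incremental addition would be followed by (partial) training and evaluation of the candidate, rather than fixed up front as in this sketch.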

