FreeREA: Training-Free Evolution-based Architecture Search

06/17/2022
by Niccolò Cavagnero, et al.

In the last decade, most research in Machine Learning has contributed to the improvement of existing models, with the aim of increasing the performance of neural networks on a variety of tasks. However, such advancements often come at the cost of increased model memory and computational requirements. This represents a significant limitation to the deployability of research output in realistic settings, where the cost, the energy consumption, and the complexity of the framework play a crucial role. To solve this issue, the designer should search for models that maximise performance while limiting their footprint. Typical approaches to this goal rely either on manual procedures, which cannot guarantee the optimality of the final design, or on Neural Architecture Search algorithms that automatise the process, at the expense of extremely high computational time. This paper provides a solution for the fast identification of a neural network that maximises model accuracy while respecting the size and computational constraints typical of tiny devices. Our approach, named FreeREA, is a custom cell-based evolution NAS algorithm that exploits an optimised combination of training-free metrics to rank architectures during the search, thus without the need for model training. Our experiments, carried out on the common benchmarks NAS-Bench-101 and NATS-Bench, demonstrate that i) FreeREA is the first method able to provide very accurate models in minutes of search time; ii) it outperforms State of the Art training-based and training-free techniques on all the datasets and benchmarks considered; and iii) it easily generalises to constrained scenarios, representing a competitive solution for fast Neural Architecture Search in generic constrained applications.
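The abstract does not include an implementation, but the search procedure it describes follows the regularised-evolution template with a training-free fitness. Below is a minimal, illustrative Python sketch of such a loop. The toy search space (OPS, NUM_EDGES), the mutation operator, and the dummy trainingfree_score are assumptions made for illustration only; they are not the paper's actual search space or its optimised metric combination.

```python
import random
from collections import deque

# Toy cell-based search space: an architecture is a tuple of operation
# choices for a fixed number of edges (a stand-in for NATS-Bench cells).
OPS = ["none", "skip", "conv1x1", "conv3x3", "avgpool"]
NUM_EDGES = 6

def random_arch():
    return tuple(random.choice(OPS) for _ in range(NUM_EDGES))

def mutate(arch):
    """Change one randomly chosen edge to a different operation."""
    i = random.randrange(len(arch))
    new_op = random.choice([op for op in OPS if op != arch[i]])
    return arch[:i] + (new_op,) + arch[i + 1:]

def trainingfree_score(arch):
    """Dummy stand-in for the fitness function.

    FreeREA instead ranks candidates with an optimised combination of
    training-free metrics computed on the untrained network; here we
    substitute a deterministic toy score so the loop runs end to end.
    """
    return sum(len(op) * (i + 1) for i, op in enumerate(arch))

def evolve(generations=100, pop_size=25, tournament_size=5):
    population = deque(random_arch() for _ in range(pop_size))
    best = max(population, key=trainingfree_score)
    for _ in range(generations):
        # Tournament selection: best-scoring parent from a random sample.
        parent = max(random.sample(list(population), tournament_size),
                     key=trainingfree_score)
        child = mutate(parent)
        population.append(child)
        population.popleft()  # age-based removal, as in regularised evolution
        if trainingfree_score(child) > trainingfree_score(best):
            best = child
    return best

if __name__ == "__main__":
    print("best architecture:", evolve())
```

Because the fitness requires no training, the cost of each generation is dominated by evaluating the training-free metrics on the candidate, which is what makes minutes-scale search plausible; the age-based removal keeps the population young and the search exploratory.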
