A Survey on Multi-Objective Neural Architecture Search

07/18/2023
by   Seyed Mahdi Shariatzadeh, et al.

Recently, expert-crafted neural architectures have increasingly been overtaken by neural architecture search (NAS), the automatic generation (and tuning) of network structures, which is closely related to hyperparameter optimization and automated machine learning (AutoML). After early NAS attempts that optimized only prediction accuracy, multi-objective neural architecture search (MONAS) has been attracting attention; it considers additional goals such as computational complexity, power consumption, and network size, seeking a trade-off between accuracy and other costs such as computation. In this paper, we present an overview of principal and state-of-the-art works in the field of MONAS. Starting from a well-categorized taxonomy and formulation for NAS, we address and correct some miscategorizations in previous surveys of the NAS field. We also provide a list of all known objectives used, add a number of new ones, and elaborate on their specifications. We analyze the most important objectives and show that the stochastic properties of some of them should be treated differently from deterministic ones in the multi-objective optimization procedure of NAS. We conclude with a number of future directions and open topics in the field of MONAS.
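To make the trade-off idea concrete, the core of most MONAS methods is Pareto dominance over several objectives (e.g., accuracy to maximize, latency and parameter count to minimize). The sketch below is an illustrative, minimal Python implementation of a Pareto-front filter over hypothetical candidate architectures; the candidate names and objective values are invented for the example and are not taken from the survey.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Candidate:
    """A hypothetical NAS candidate with three objectives."""
    name: str
    accuracy: float    # maximize
    latency_ms: float  # minimize
    params_m: float    # minimize (millions of parameters)

def dominates(a: Candidate, b: Candidate) -> bool:
    """True if `a` is at least as good as `b` on every objective
    and strictly better on at least one."""
    at_least_as_good = (a.accuracy >= b.accuracy and
                        a.latency_ms <= b.latency_ms and
                        a.params_m <= b.params_m)
    strictly_better = (a.accuracy > b.accuracy or
                       a.latency_ms < b.latency_ms or
                       a.params_m < b.params_m)
    return at_least_as_good and strictly_better

def pareto_front(pop: List[Candidate]) -> List[Candidate]:
    """Keep only candidates not dominated by any other candidate."""
    return [c for c in pop
            if not any(dominates(o, c) for o in pop if o is not c)]

# Invented example population:
pop = [
    Candidate("A", 0.95, 12.0, 5.3),
    Candidate("B", 0.93,  8.0, 3.1),
    Candidate("C", 0.92, 15.0, 6.0),  # dominated by A and D
    Candidate("D", 0.95,  9.0, 4.0),  # dominates A (equal accuracy, cheaper)
]
front = pareto_front(pop)
print(sorted(c.name for c in front))
```

For stochastic objectives such as measured accuracy or latency, a single evaluation is a noisy sample, so a practical search would compare candidates on repeated measurements (or confidence bounds) rather than on one value, which is one way the stochastic/deterministic distinction discussed above matters in practice.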

Related research:

- Tackling Neural Architecture Search With Quality Diversity Optimization (07/30/2022)
- MONAS: Multi-Objective Neural Architecture Search using Reinforcement Learning (06/27/2018)
- Evolutionary Neural Architecture Search Supporting Approximate Multipliers (01/28/2021)
- Local Search is a Remarkably Strong Baseline for Neural Architecture Search (04/20/2020)
- MO-DEHB: Evolutionary-based Hyperband for Multi-Objective Optimization (05/08/2023)
- How to Simplify Search: Classification-wise Pareto Evolution for One-shot Neural Architecture Search (09/14/2021)
- Accelerating Neural Architecture Exploration Across Modalities Using Genetic Algorithms (02/25/2022)
