A resource-efficient method for repeated HPO and NAS problems

03/30/2021
by Giovanni Zappella, et al.

In this work we consider the problem of repeated hyperparameter and neural architecture search (HNAS). We propose an extension of Successive Halving that leverages information gained in previous HNAS problems in order to save computational resources. We empirically demonstrate that our solution drastically decreases costs while maintaining accuracy and remaining robust to negative transfer. Our method is significantly simpler than competing transfer learning approaches, setting a new baseline for transfer learning in HNAS.
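For reference, the sketch below illustrates the standard Successive Halving routine that the abstract builds on: evaluate all candidate configurations at a small budget, keep the best 1/eta fraction, and repeat with an eta-times larger budget. This is a minimal illustration only; the function names (`evaluate`), the budget schedule, and the toy loss are assumptions for the example and do not reproduce the paper's transfer-learning extension.

```python
# Minimal sketch of standard Successive Halving (the base algorithm the paper
# extends). The evaluation function and budget schedule are illustrative
# assumptions, not the authors' implementation.
import random


def successive_halving(configs, evaluate, min_budget=1, eta=3):
    """Keep the best 1/eta fraction of configurations at each rung,
    multiplying the per-configuration budget by eta after every rung."""
    budget = min_budget
    survivors = list(configs)
    while len(survivors) > 1:
        # Evaluate every surviving configuration at the current budget
        # (e.g. training epochs); lower score is assumed to be better.
        scores = [(evaluate(cfg, budget), cfg) for cfg in survivors]
        scores.sort(key=lambda pair: pair[0])
        # Promote only the top 1/eta fraction to the next, larger budget.
        keep = max(1, len(survivors) // eta)
        survivors = [cfg for _, cfg in scores[:keep]]
        budget *= eta
    return survivors[0]


if __name__ == "__main__":
    # Toy example: "configurations" are learning rates and "evaluation" is a
    # noisy synthetic loss whose noise shrinks as the budget grows.
    candidates = [10 ** random.uniform(-4, 0) for _ in range(27)]
    noisy_loss = lambda lr, b: abs(lr - 1e-2) + abs(random.gauss(0, 0.1 / b))
    print("selected learning rate:", successive_halving(candidates, noisy_loss))
```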


Related research

10/07/2021 · Conceptual Expansion Neural Architecture Search (CENAS)
Architecture search optimizes the structure of a neural network for some...

07/14/2022 · PASHA: Efficient HPO with Progressive Resource Allocation
Hyperparameter optimization (HPO) and neural architecture search (NAS) a...

10/30/2017 · Transfer Learning to Learn with Multitask Neural Model Search
Deep learning models require extensive architecture design exploration a...

10/02/2021 · Transfer Learning Approaches for Knowledge Discovery in Grid-based Geo-Spatiotemporal Data
Extracting and meticulously analyzing geo-spatiotemporal features is cru...

07/20/2020 · NSGANetV2: Evolutionary Multi-Objective Surrogate-Assisted Neural Architecture Search
In this paper, we propose an efficient NAS algorithm for generating task...

06/16/2020 · Fine-Tuning DARTS for Image Classification
Neural Architecture Search (NAS) has gained attraction due to superior c...

10/04/2022 · Toward Edge-Efficient Dense Predictions with Synergistic Multi-Task Neural Architecture Search
In this work, we propose a novel and scalable solution to address the ch...
