Efficient Forward Architecture Search

05/31/2019
by Hanzhang Hu, et al.

We propose a neural architecture search (NAS) algorithm, Petridish, that iteratively adds shortcut connections to existing network layers. The added shortcut connections effectively perform gradient boosting on the augmented layers. The algorithm is motivated by the feature selection algorithm forward stage-wise linear regression: we view NAS as a generalization of feature selection for regression, where NAS selects shortcuts among layers instead of selecting features. To reduce the number of trials over possible connection combinations, we jointly train all candidate connections at each stage of growth and leverage feature selection techniques to choose a subset of them. We experimentally show that this process is an efficient forward architecture search algorithm that can find competitive models in a few GPU days, both in the search space of repeatable network modules (cell-search) and in the space of general networks (macro-search). Petridish is particularly well-suited for warm-starting from existing models, which is crucial for lifelong-learning scenarios.
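The abstract's analogy rests on forward stage-wise linear regression: repeatedly find the candidate most correlated with the current residual and nudge its coefficient, so that candidates are added greedily, the way Petridish greedily adds shortcut connections that best fit the loss gradient. A minimal NumPy sketch of that base algorithm (an illustration of the analogy, not the paper's actual implementation; `eps` and `n_steps` are arbitrary choices here):

```python
import numpy as np

def forward_stagewise(X, y, n_steps=500, eps=0.05):
    """Forward stage-wise linear regression.

    At each step, pick the feature (column of X) most correlated with
    the current residual and move its coefficient by a small amount eps.
    In Petridish's generalization, candidate shortcut connections play
    the role of features, and the "residual" is the loss gradient that
    the added connections boost against.
    """
    n, p = X.shape
    beta = np.zeros(p)
    residual = y.astype(float).copy()
    for _ in range(n_steps):
        corr = X.T @ residual              # correlation of each feature with residual
        j = int(np.argmax(np.abs(corr)))   # greedily select the best-matching feature
        delta = eps * np.sign(corr[j])     # small step in the winning direction
        beta[j] += delta
        residual -= delta * X[:, j]        # update the residual
    return beta
```

On noiseless data where only a couple of features matter, the recovered coefficients concentrate on those features, mirroring how the search keeps only a subset of the jointly trained candidate connections.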
