ICDARTS: Improving the Stability and Performance of Cyclic DARTS

by Emily Herron, et al.

This work introduces improvements to the stability and generalizability of Cyclic DARTS (CDARTS). CDARTS is a Differentiable Architecture Search (DARTS)-based approach to neural architecture search (NAS) that uses a cyclic feedback mechanism to train search and evaluation networks concurrently. This training protocol aims to optimize the search process by enforcing that the search and evaluation networks produce similar outputs. However, CDARTS introduces a loss function for the evaluation network that is dependent on the search network. Because this loss differs from the one the evaluation network uses during retraining, the search-phase evaluation network is a sub-optimal proxy for the final evaluation network. We present ICDARTS, a revised approach that eliminates the dependency of the evaluation network weights upon those of the search network, along with a modified process for discretizing the search network's zero operations that allows these operations to be retained in the final evaluation networks. We pair these changes with ablation studies of ICDARTS' algorithm and network template. Finally, we explore methods for expanding the search space of ICDARTS by enlarging its operation set and investigating alternate methods for discretizing its continuous search cells. These experiments yielded networks with improved generalizability and introduced a novel method for incorporating a dynamic search space into ICDARTS.
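To make the discretization change concrete, the sketch below contrasts standard DARTS discretization, which excludes the zero operation when selecting an edge's final operation, with the modified scheme described above, which allows zero operations to be retained in the final evaluation network. The operation names and function signature are illustrative assumptions, not the paper's actual implementation.

```python
import math

# Hypothetical candidate operation set for a single search-cell edge.
OPS = ["zero", "skip_connect", "sep_conv_3x3", "max_pool_3x3"]

def softmax(xs):
    """Numerically stable softmax over a list of floats."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def discretize_edge(alphas, allow_zero=True):
    """Select the strongest operation on an edge from its
    architecture parameters (alphas).

    Standard DARTS drops 'zero' from consideration when
    discretizing; setting allow_zero=True models the ICDARTS
    modification that lets 'zero' survive into the final
    evaluation network.
    """
    weights = softmax(alphas)
    if allow_zero:
        candidates = range(len(OPS))
    else:
        candidates = [i for i, op in enumerate(OPS) if op != "zero"]
    best = max(candidates, key=lambda i: weights[i])
    return OPS[best]
```

For example, when the zero operation carries the largest weight, `discretize_edge([2.0, 0.5, 0.1, 0.3], allow_zero=True)` keeps the zero operation, whereas `allow_zero=False` falls back to the strongest non-zero operation.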

