Partial Connection Based on Channel Attention for Differentiable Neural Architecture Search

08/01/2022
by   Yu Xue, et al.

Differentiable neural architecture search (DARTS), as a gradient-guided search method, greatly reduces the computational cost of architecture search and speeds it up. In DARTS, architecture parameters are attached to the candidate operations, but the weights of some parameterized operations may be poorly trained in the early stage, which causes unfair competition between candidates: weight-free operations such as skip connections are selected in large numbers, leading to a performance collapse. In addition, training the supernet occupies a large amount of memory, so memory utilization is low. This paper proposes a partial channel connection based on channel attention for differentiable neural architecture search (ADARTS). Channels with higher attention weights are selected and sent into the operation space, while the remaining channels are concatenated directly with the processed ones. Selecting a few channels with high attention weights transmits the important feature information into the search space, greatly improving search efficiency and memory utilization, and it avoids the instability of the network structure caused by random channel selection. Experimental results show that ADARTS achieved a test error of 2.46% on CIFAR-10 and competitive results on CIFAR-100. ADARTS effectively alleviates the problem of too many skip connections appearing during the search and obtains network structures with better performance.
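The channel-routing idea described above can be sketched roughly as follows. This is a minimal NumPy illustration, not the authors' implementation: the attention weighting here is a simple global-average-pool squeeze (a stand-in for a learned attention module), and `op` stands in for the whole candidate operation space.

```python
import numpy as np

def attention_partial_forward(x, op, k):
    """Route only the k channels with the highest attention weight through `op`.

    x  : feature map of shape (C, H, W)
    op : callable applied to the selected channels (placeholder for the
         operation space in the search cell)
    k  : number of channels sent into the operation space
    """
    # Squeeze: a per-channel attention score via global average pooling
    # (a simplified stand-in for a learned channel-attention module).
    scores = x.mean(axis=(1, 2))              # shape (C,)

    # Indices of the k channels with the highest attention scores.
    selected = np.argsort(scores)[-k:]
    mask = np.zeros(x.shape[0], dtype=bool)
    mask[selected] = True

    # High-attention channels go through the operation space;
    # the remaining channels bypass it and are kept unchanged,
    # then the two groups are recombined along the channel axis.
    out = np.empty_like(x)
    out[mask] = op(x[mask])
    out[~mask] = x[~mask]
    return out
```

Selecting channels by attention weight rather than at random (as in random partial-channel schemes) is what the abstract credits with avoiding the instability of the resulting network structure.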

