Inter-choice dependent super-network weights

04/23/2021
by Kevin Alexander Laube, et al.

The automatic design of neural network architectures, Neural Architecture Search (NAS), has gained considerable attention in recent years, as the resulting networks have repeatedly set new state-of-the-art results in several disciplines. The network search spaces are often finite and designed by hand, so that a fixed and small number of decisions constitutes a specific architecture. Under these circumstances, inter-choice dependencies are likely to exist and to affect the network search, yet they are unaccounted for in popular one-shot methods. We extend Single-Path One-Shot search networks with additional weights that depend on combinations of choices and analyze their effect. Experiments in NAS-Bench-201 and SubImageNet based search spaces show improved super-network performance in convolution-only settings, and that the overhead is nearly negligible for sequential network designs.
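The core idea, super-network weights that are selected by a combination of architecture choices rather than by a single choice, can be illustrated with a small sketch. The code below is not the authors' implementation: the module name, the use of PyTorch, the pairwise 1x1 convolutions, and the way the previous layer's choice is passed in are all assumptions made purely for illustration.

```python
# Minimal sketch (not the paper's code) of inter-choice dependent weights.
# In a Single-Path One-Shot super-network, each layer usually owns one weight
# tensor per candidate operation. Here, a layer additionally owns weights that
# are indexed by the *pair* (previous layer's choice, own choice), so that
# dependencies between choices can be reflected in the shared weights.
import torch
import torch.nn as nn


class InterChoiceConv(nn.Module):
    """Conv layer with extra weights selected by (previous choice, own choice)."""

    def __init__(self, channels: int, num_choices: int, kernel_sizes=(1, 3, 5)):
        super().__init__()
        assert len(kernel_sizes) == num_choices
        # standard per-choice candidate operations
        self.ops = nn.ModuleList(
            nn.Conv2d(channels, channels, k, padding=k // 2) for k in kernel_sizes
        )
        # additional 1x1 convolutions indexed by (prev_choice, own_choice) pairs
        self.pairwise = nn.ModuleList(
            nn.ModuleList(nn.Conv2d(channels, channels, 1) for _ in range(num_choices))
            for _ in range(num_choices)
        )

    def forward(self, x, prev_choice: int, choice: int):
        out = self.ops[choice](x)
        # add the contribution of the combination-dependent weights
        return out + self.pairwise[prev_choice][choice](x)


if __name__ == "__main__":
    layer = InterChoiceConv(channels=16, num_choices=3)
    x = torch.randn(2, 16, 32, 32)
    # during super-network training, (prev_choice, choice) would be sampled
    # uniformly per batch, as in Single-Path One-Shot training
    y = layer(x, prev_choice=1, choice=2)
    print(y.shape)  # torch.Size([2, 16, 32, 32])
```

In a sequential (chain-structured) search space of this kind, each layer only has to account for its immediate predecessor, which is why the parameter and compute overhead of such combination-dependent weights can remain small.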
