
Inter-choice dependent super-network weights

The automatic design of neural network architectures, Neural Architecture Search (NAS), has gained considerable attention in recent years, as the resulting networks have repeatedly broken state-of-the-art results across several disciplines. The network search spaces are often finite and designed by hand, so that a fixed and small number of decisions constitutes a specific architecture. Under these circumstances, inter-choice dependencies are likely to exist and to affect the search, but they are unaccounted for in the popular one-shot methods. We extend Single-Path One-Shot search networks with additional weights that depend on combinations of choices and analyze their effect. Experiments in NAS-Bench-201 and SubImageNet-based search spaces show improved super-network performance in convolutions-only settings, with nearly negligible overhead for sequential network designs.
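To make the idea of inter-choice dependent weights concrete, the following is a minimal, illustrative PyTorch sketch; it is not the paper's implementation, and the class name, layer choices, and sampling loop are assumptions. It shows a super-network block whose weights are indexed by the combination of the previous block's choice and the current block's choice, instead of by the current choice alone, as in standard Single-Path One-Shot training.

```python
import torch
import torch.nn as nn


class InterChoiceDependentBlock(nn.Module):
    """Hypothetical sketch: weights are stored per (previous choice, current choice)
    combination rather than per single choice. Names and ops are illustrative."""

    def __init__(self, channels: int, num_choices: int):
        super().__init__()
        # One candidate operation per combination of choices, so the sampled
        # path determines which dependent weights are trained together.
        self.ops = nn.ModuleList([
            nn.ModuleList([
                nn.Conv2d(channels, channels, kernel_size=3, padding=1)
                for _ in range(num_choices)
            ])
            for _ in range(num_choices)
        ])

    def forward(self, x: torch.Tensor, prev_choice: int, choice: int) -> torch.Tensor:
        # Select the weights tied to the (prev_choice, choice) combination.
        return self.ops[prev_choice][choice](x)


if __name__ == "__main__":
    # Single-Path One-Shot style training samples one path per batch;
    # here both the previous and current choices index the block's weights.
    block = InterChoiceDependentBlock(channels=16, num_choices=5)
    x = torch.randn(2, 16, 32, 32)
    prev_c, cur_c = torch.randint(0, 5, (2,)).tolist()
    y = block(x, prev_c, cur_c)
    print(y.shape)  # torch.Size([2, 16, 32, 32])
```

In a sequential search space, each block only needs to condition on its immediate predecessor, which keeps the number of extra weight sets small; this is one plausible reason the overhead reported in the abstract remains nearly negligible for sequential network designs.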
