BossNAS: Exploring Hybrid CNN-transformers with Block-wisely Self-supervised Neural Architecture Search

A myriad of recent breakthroughs in hand-crafted neural architectures for visual recognition have highlighted the urgent need to explore hybrid architectures consisting of diversified building blocks. Meanwhile, neural architecture search (NAS) methods are surging, with the expectation of reducing human effort. However, whether NAS methods can efficiently and effectively handle diversified search spaces with disparate candidates (e.g., CNNs and transformers) is still an open question. In this work, we present Block-wisely Self-supervised Neural Architecture Search (BossNAS), an unsupervised NAS method that addresses the problem of inaccurate architecture rating caused by the large weight-sharing space and biased supervision in previous methods. More specifically, we factorize the search space into blocks and utilize a novel self-supervised training scheme, named ensemble bootstrapping, to train each block separately before searching them as a whole towards the population center. Additionally, we present the HyTra search space, a fabric-like hybrid CNN-transformer search space with searchable down-sampling positions. On this challenging search space, our searched model, BossNet-T, achieves up to 82.5% accuracy on ImageNet, surpassing EfficientNet by 2.4% with comparable compute time. Moreover, our method achieves superior architecture rating accuracy, with 0.78 and 0.76 Spearman correlation on the canonical MBConv search space with ImageNet and on the NATS-Bench size search space with CIFAR-100, respectively, surpassing state-of-the-art NAS methods. Code: https://github.com/changlin31/BossNAS
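The abstract describes ensemble bootstrapping only at a high level: sampled candidate architectures within a block are trained separately towards a shared "population center" target. The sketch below illustrates one plausible reading of that idea in PyTorch, using a BYOL-style cosine regression of each sampled path towards the averaged output of an EMA teacher over all sampled paths. The interfaces `online_block(x, path)` and `target_block(x, path)` are hypothetical placeholders; consult the linked repository for the authors' actual implementation.

```python
# Minimal, illustrative sketch of the ensemble-bootstrapping idea, assuming
# hypothetical block interfaces online_block(x, path) and target_block(x, path).
# This is NOT the official BossNAS implementation.
import torch
import torch.nn.functional as F

def ensemble_bootstrap_loss(online_block, target_block, candidate_paths, x):
    """Train each sampled path to regress towards the ensemble (mean) of the
    EMA teacher's outputs over all sampled paths (the 'population center')."""
    with torch.no_grad():
        # Teacher features for every sampled path; their normalized mean serves
        # as the shared target for all student paths in this block.
        teacher_feats = [F.normalize(target_block(x, path), dim=-1)
                         for path in candidate_paths]
        target = F.normalize(torch.stack(teacher_feats).mean(dim=0), dim=-1)

    loss = 0.0
    for path in candidate_paths:
        student = F.normalize(online_block(x, path), dim=-1)
        # BYOL-style cosine regression of each student path to the ensemble target.
        loss = loss + (2 - 2 * (student * target).sum(dim=-1)).mean()
    return loss / len(candidate_paths)
```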

ICCV 2021
Task                         Dataset                     Model        Metric Name       Metric Value   Global Rank
Image Classification         ImageNet                    BossNet-T1   Top-1 Accuracy    82.2%          #507
Image Classification         ImageNet                    BossNet-T1   GFLOPs            15.8           #344
Neural Architecture Search   ImageNet                    BossNet-T1+  Top-1 Error Rate  17.8           #6
Neural Architecture Search   ImageNet                    BossNet-T1+  Accuracy          82.2           #4
Neural Architecture Search   ImageNet                    BossNet-T1+  MACs              10.5G          #5
Neural Architecture Search   NATS-Bench Size, CIFAR-10   BossNAS      Kendall's Tau     0.53           #1
Neural Architecture Search   NATS-Bench Size, CIFAR-10   BossNAS      Spearman's Rho    0.73           #1
Neural Architecture Search   NATS-Bench Size, CIFAR-10   BossNAS      Pearson R         0.72           #1
Neural Architecture Search   NATS-Bench Size, CIFAR-10   BossNAS      Acc. (test)       93.29          #1
Neural Architecture Search   NATS-Bench Size, CIFAR-100  BossNAS      Kendall's Tau     0.59           #1
Neural Architecture Search   NATS-Bench Size, CIFAR-100  BossNAS      Spearman's Rho    0.76           #1
Neural Architecture Search   NATS-Bench Size, CIFAR-100  BossNAS      Pearson R         0.79           #1
Neural Architecture Search   NATS-Bench Size, CIFAR-100  BossNAS      Acc. (test)       70.86          #1
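The rank-correlation metrics reported in the table above (Kendall's Tau, Spearman's Rho, Pearson R) measure how well a NAS method's predicted ranking of candidate architectures agrees with their ground-truth accuracies. A minimal sketch of how such metrics are computed with SciPy follows; the score arrays are placeholders for illustration only, not values from the paper or the benchmark.

```python
# Sketch: computing architecture-rating correlation metrics with SciPy.
# The two arrays below are made-up placeholders, not results from BossNAS.
from scipy import stats

predicted_scores = [0.61, 0.55, 0.72, 0.48, 0.69]   # e.g. supernet validation scores
groundtruth_acc  = [70.1, 68.4, 71.9, 66.0, 70.8]   # e.g. stand-alone test accuracies

tau, _ = stats.kendalltau(predicted_scores, groundtruth_acc)
rho, _ = stats.spearmanr(predicted_scores, groundtruth_acc)
r, _ = stats.pearsonr(predicted_scores, groundtruth_acc)
print(f"Kendall's Tau: {tau:.2f}, Spearman's Rho: {rho:.2f}, Pearson R: {r:.2f}")
```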