BA-Net: Bridge Attention for Deep Convolutional Neural Networks

8 Dec 2021 · Yue Zhao, Junzhou Chen, Zirui Zhang, Ronghui Zhang

In recent years, the channel attention mechanism has been widely investigated due to its great potential for improving the performance of deep convolutional neural networks (CNNs) on many vision tasks. However, most existing methods feed only the output of the adjacent convolution layer into the attention layer to compute the channel weights; information from other convolution layers is ignored. Motivated by this observation, we propose a simple strategy, named Bridge Attention Net (BA-Net), to make better use of channel attention mechanisms. The core idea of this design is to bridge the outputs of previous convolution layers through skip connections when generating channel weights. Based on our experiments and theoretical analysis, we find that features from previous layers also contribute significantly to the weights. Comprehensive evaluation demonstrates that the proposed approach achieves state-of-the-art (SOTA) performance in both accuracy and speed compared with existing methods, which shows that Bridge Attention provides a new perspective on the design of neural network architectures with great potential for improving performance. The code is available at https://github.com/zhaoy376/Bridge-Attention.
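To make the core idea concrete, below is a minimal PyTorch sketch of a bridge-attention-style block: instead of squeezing only the current layer's output (as in SE-style attention), it pools several earlier feature maps and fuses them to produce the channel weights. This is not the authors' implementation (see their repository for that); the module name `BridgeAttention`, the per-branch linear squeeze, and the summation fusion are illustrative assumptions.

```python
import torch
import torch.nn as nn

class BridgeAttention(nn.Module):
    """Sketch of a bridge-attention block: channel weights are computed
    from several preceding feature maps, not only the adjacent one."""

    def __init__(self, channels_list, out_channels, reduction=16):
        # channels_list: channel counts of the bridged (earlier) feature maps
        # out_channels: channels of the feature map being re-weighted
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)
        # one squeeze projection per bridged feature map (assumption)
        self.squeeze = nn.ModuleList(
            nn.Linear(c, out_channels // reduction) for c in channels_list
        )
        self.excite = nn.Sequential(
            nn.ReLU(inplace=True),
            nn.Linear(out_channels // reduction, out_channels),
            nn.Sigmoid(),
        )

    def forward(self, bridged_feats, x):
        # bridged_feats: list of feature maps from previous conv layers
        # x: output of the current conv layer, shape (N, out_channels, H, W)
        fused = 0
        for feat, fc in zip(bridged_feats, self.squeeze):
            # global average pool each bridged feature, then project and sum
            fused = fused + fc(self.pool(feat).flatten(1))   # (N, C/r)
        weights = self.excite(fused)                          # (N, out_channels)
        return x * weights.unsqueeze(-1).unsqueeze(-1)


if __name__ == "__main__":
    # toy usage: bridge two earlier feature maps into the attention of the last one
    f1 = torch.randn(2, 64, 56, 56)
    f2 = torch.randn(2, 64, 56, 56)
    x = torch.randn(2, 256, 56, 56)
    ba = BridgeAttention(channels_list=[64, 64], out_channels=256)
    print(ba([f1, f2], x).shape)  # torch.Size([2, 256, 56, 56])
```

Under these assumptions, the block reduces to ordinary squeeze-and-excitation when `bridged_feats` contains only the current layer's output, which is how the paper frames existing channel attention as a special case.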
