RFAConv: Innovating Spatial Attention and Standard Convolutional Operation

04/06/2023
by Xin Zhang, et al.

Spatial attention has been demonstrated to enable convolutional neural networks to focus on critical information and thereby improve network performance, but it still has limitations. In this paper, we explain the effectiveness of spatial attention from a new perspective: the spatial attention mechanism essentially solves the problem of convolutional kernel parameter sharing. However, the information contained in the attention map generated by spatial attention is still insufficient for large-size convolutional kernels. We therefore propose a new attention mechanism called Receptive-Field Attention (RFA). The Convolutional Block Attention Module (CBAM) and Coordinate Attention (CA) focus only on spatial features and cannot fully solve the problem of convolutional kernel parameter sharing; in RFA, the receptive-field spatial feature is not only emphasized but also provides effective attention weights for large-size convolutional kernels. The Receptive-Field Attention convolutional operation (RFAConv) derived from RFA can be considered a new way to replace standard convolution, at an almost negligible cost in computation and parameters. Numerous experiments on ImageNet-1k, MS COCO, and VOC demonstrate the superior performance of our approach on classification, object detection, and semantic segmentation tasks. Importantly, we believe that for current spatial attention mechanisms that focus only on spatial features, it is time to improve network performance by focusing on receptive-field spatial features. The code and pre-trained models for the relevant tasks can be found at https://github.com/Liuchen1997/RFAConv
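To make the core idea concrete, here is a minimal NumPy sketch of a receptive-field attention convolution: each k×k receptive field receives its own softmax attention map, so the effective kernel weighting is no longer shared across spatial positions. This is only an illustration, not the authors' implementation — in particular, the attention here is derived from channel-averaged patch features, whereas the paper learns the attention weights.

```python
import numpy as np

def softmax(z, axis):
    e = np.exp(z - z.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def rfa_conv(x, weight, k=3):
    """Illustrative receptive-field attention convolution (hypothetical sketch).

    x:      (C, H, W) input feature map
    weight: (C_out, C, k, k) shared convolution kernel

    At every spatial position, the k*k receptive field is re-weighted by
    its own attention map before the shared kernel is applied, so the
    effective parameters differ per position (unlike standard convolution).
    """
    C, H, W = x.shape
    p = k // 2
    xp = np.pad(x, ((0, 0), (p, p), (p, p)))  # 'same' padding
    out = np.zeros((weight.shape[0], H, W))
    for i in range(H):
        for j in range(W):
            patch = xp[:, i:i + k, j:j + k]               # (C, k, k)
            # Attention over the k*k receptive-field positions, computed
            # from channel-averaged features (the paper learns this module;
            # this average-based variant is an assumption for illustration).
            att = softmax(patch.mean(axis=0).reshape(-1), axis=0).reshape(k, k)
            weighted = patch * att                        # position-specific weighting
            out[:, i, j] = np.tensordot(
                weight, weighted, axes=([1, 2, 3], [0, 1, 2])
            )
    return out
```

For example, an input of shape `(2, 5, 5)` with a `(4, 2, 3, 3)` kernel yields a `(4, 5, 5)` output, matching a same-padded standard convolution; the real RFAConv achieves the same effect efficiently with grouped convolutions and unfold-style reshaping rather than explicit loops.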


