Abstract
The channel attention mechanism is a technique widely employed in deep convolutional neural networks to boost performance on image processing tasks, e.g., image classification and image super-resolution. It is usually designed as a parameterized sub-network embedded into the convolutional layers of the network to learn more powerful feature representations. However, current channel attention modules introduce additional parameters and therefore incur higher computational costs. To address this issue, we propose a Parameter-Free Channel Attention (PFCA) module that boosts the performance of popular image classification and image super-resolution networks while completely eliminating the parameter growth of channel attention. Experiments on CIFAR-100, ImageNet, and DIV2K validate that our PFCA module improves the performance of ResNet on image classification and of MSRResNet on image super-resolution, while adding almost no parameters and FLOPs.
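To make the contrast concrete, the sketch below shows a conventional SE-style channel attention (global average pooling followed by two learned fully connected layers) next to an illustrative parameter-free gate that rescales channels using only channel statistics. The parameter-free variant here (a sigmoid of each channel's standardized global-average response) is an assumption for illustration only, not the paper's actual PFCA formulation; `w1` and `w2` are hypothetical learned weight matrices.

```python
import numpy as np

def se_channel_attention(x, w1, w2):
    """Conventional SE-style channel attention with learned weights w1, w2.
    x: feature map of shape (C, H, W)."""
    s = x.mean(axis=(1, 2))                 # squeeze: global average pool -> (C,)
    h = np.maximum(w1 @ s, 0.0)             # excitation: FC (C -> C/r) + ReLU
    a = 1.0 / (1.0 + np.exp(-(w2 @ h)))     # FC (C/r -> C) + sigmoid gate in (0, 1)
    return x * a[:, None, None]             # rescale each channel

def parameter_free_channel_attention(x, eps=1e-5):
    """Illustrative parameter-free gate (an assumption, NOT the paper's PFCA):
    weight each channel by a sigmoid of its standardized pooled response."""
    s = x.mean(axis=(1, 2))                 # per-channel descriptor, no learned weights
    z = (s - s.mean()) / (s.std() + eps)    # standardize across channels
    a = 1.0 / (1.0 + np.exp(-z))            # sigmoid gate in (0, 1)
    return x * a[:, None, None]
```

The second function uses no trainable tensors, which is how a channel attention module can avoid any parameter growth; the only extra cost is the cheap pooling and gating arithmetic.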
URL
https://arxiv.org/abs/2303.11055