Abstract
Edge detection has attracted considerable attention thanks to its exceptional ability to enhance performance in downstream computer vision tasks. In recent years, various deep learning methods have been explored for edge detection, resulting in significant performance improvements over conventional computer vision algorithms. In neural networks, edge detection requires considerably large receptive fields to achieve satisfactory performance. With typical convolutional operations, such a large receptive field can only be achieved by stacking a significant number of consecutive layers, which yields deep network structures. Recently, a Multi-scale Tensorial Summation (MTS) factorization operator was presented, which can achieve very large receptive fields even from the initial layers. In this paper, we propose a novel MTS Dimensional Reduction (MTS-DR) module guided neural network, MTS-DR-Net, for the edge detection task. MTS-DR-Net uses MTS layers and corresponding MTS-DR blocks as a new backbone to remove redundant information at an early stage. Such a dimensional reduction module enables the neural network to focus specifically on relevant information (i.e., necessary subspaces). Finally, a weight U-shaped refinement module follows the MTS-DR blocks in MTS-DR-Net. We conducted extensive experiments on two benchmark edge detection datasets, BSDS500 and BIPEDv2, to verify the effectiveness of our model. The implementation of the proposed MTS-DR-Net can be found at this https URL.
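To make the described pipeline more concrete, below is a minimal PyTorch sketch of how an MTS-DR-Net-style model could be wired together. It is not the authors' implementation: it assumes the MTS operator can be approximated as a sum of mode-wise (channel/height/width) linear maps applied at several pooled scales, that the DR step is a learned 1x1 projection onto a lower-dimensional channel subspace, and that the refinement head is a small encoder-decoder with a skip connection. All class names, channel widths, the rank, and the scale set are illustrative assumptions.

```python
# Hypothetical sketch (not the authors' code) of an MTS-DR-Net-style pipeline.
import torch
import torch.nn as nn
import torch.nn.functional as F


class TensorialSummation(nn.Module):
    """Assumed form of tensorial summation: a sum of rank-many mode-wise linear maps."""

    def __init__(self, channels: int, height: int, width: int, rank: int = 2):
        super().__init__()
        self.U = nn.Parameter(torch.randn(rank, channels, channels) * 0.02)  # channel mode
        self.V = nn.Parameter(torch.randn(rank, height, height) * 0.02)      # height mode
        self.W = nn.Parameter(torch.randn(rank, width, width) * 0.02)        # width mode

    def forward(self, x):  # x: (B, C, H, W)
        # Each term mixes information along an entire mode, so even a single layer
        # has a receptive field spanning the whole input.
        return sum(
            torch.einsum("bchw,dc,ih,jw->bdij", x, self.U[r], self.V[r], self.W[r])
            for r in range(self.U.shape[0])
        )


class MTS(nn.Module):
    """Multi-scale variant (assumed): apply tensorial summation at pooled scales and fuse."""

    def __init__(self, channels: int, height: int, width: int, scales=(1, 2, 4)):
        super().__init__()
        self.scales = scales
        self.ops = nn.ModuleList(
            TensorialSummation(channels, height // s, width // s) for s in scales
        )

    def forward(self, x):
        h, w = x.shape[-2:]
        out = 0
        for s, op in zip(self.scales, self.ops):
            ys = op(F.avg_pool2d(x, s)) if s > 1 else op(x)
            if s > 1:
                ys = F.interpolate(ys, size=(h, w), mode="bilinear", align_corners=False)
            out = out + ys
        return out


class MTSDRBlock(nn.Module):
    """MTS layer followed by a dimensionality-reducing projection (redundancy removal)."""

    def __init__(self, in_ch, out_ch, height, width):
        super().__init__()
        self.mts = MTS(in_ch, height, width)
        self.reduce = nn.Conv2d(in_ch, out_ch, kernel_size=1)
        self.act = nn.GELU()

    def forward(self, x):
        return self.act(self.reduce(self.mts(x)))


class UNetRefiner(nn.Module):
    """Small U-shaped refinement head producing a single-channel edge map."""

    def __init__(self, ch):
        super().__init__()
        self.enc = nn.Conv2d(ch, ch * 2, 3, stride=2, padding=1)
        self.dec = nn.ConvTranspose2d(ch * 2, ch, 2, stride=2)
        self.head = nn.Conv2d(ch * 2, 1, 1)  # fuses skip + decoded features

    def forward(self, x):
        d = F.gelu(self.dec(F.gelu(self.enc(x))))
        return self.head(torch.cat([x, d], dim=1))


class MTSDRNet(nn.Module):
    """Stem, stacked MTS-DR blocks as backbone, then the U-shaped refiner."""

    def __init__(self, height=160, width=160):
        super().__init__()
        self.stem = nn.Conv2d(3, 32, 3, padding=1)
        self.backbone = nn.Sequential(
            MTSDRBlock(32, 16, height, width),
            MTSDRBlock(16, 8, height, width),
        )
        self.refiner = UNetRefiner(8)

    def forward(self, x):
        return torch.sigmoid(self.refiner(self.backbone(self.stem(x))))


if __name__ == "__main__":
    net = MTSDRNet()
    edges = net(torch.randn(1, 3, 160, 160))
    print(edges.shape)  # torch.Size([1, 1, 160, 160])
```

The key design point this sketch tries to reflect is that the mode-wise maps in the MTS layer mix features across entire spatial dimensions from the very first block, in contrast to small convolutional kernels whose receptive field grows only with depth.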
URL
https://arxiv.org/abs/2504.15770