Abstract
Cloth-Changing Person Re-Identification (CC-ReID) aims to accurately identify the target person in more realistic surveillance scenarios, where pedestrians usually change their clothing. Despite great progress, limited cloth-changing training samples in existing CC-ReID datasets still prevent the model from adequately learning cloth-irrelevant features. In addition, due to the absence of explicit supervision to keep the model constantly focused on cloth-irrelevant areas, existing methods are still hampered by the disruption of clothing variations. To solve the above issues, we propose an Identity-aware Dual-constraint Network (IDNet) for the CC-ReID task. Specifically, to help the model extract cloth-irrelevant clues, we propose a Clothes Diversity Augmentation (CDA), which generates more realistic cloth-changing samples by enriching the clothing color while preserving the texture. In addition, a Multi-scale Constraint Block (MCB) is designed, which extracts fine-grained identity-related features and effectively transfers cloth-irrelevant knowledge. Moreover, a Counterfactual-guided Attention Module (CAM) is presented, which learns cloth-irrelevant features from channel and space dimensions and utilizes the counterfactual intervention for supervising the attention map to highlight identity-related regions. Finally, a Semantic Alignment Constraint (SAC) is designed to facilitate high-level semantic feature interaction. Comprehensive experiments on four CC-ReID datasets indicate that our method outperforms prior state-of-the-art approaches.
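The paper does not include implementation details here, but the idea behind CDA — changing clothing color while keeping texture intact — can be sketched with a simple luminance-preserving hue rotation. The sketch below is an illustrative assumption, not the authors' method: the function name `clothes_color_augment`, the use of a YIQ hue rotation, and the availability of a per-pixel clothing mask (e.g. from a human-parsing model) are all hypothetical.

```python
import numpy as np

def clothes_color_augment(img, cloth_mask, angle_deg):
    """Hypothetical CDA-style augmentation: rotate the hue of clothing
    pixels while leaving luminance (and hence texture detail) unchanged.

    img:        float array in [0, 1], shape (H, W, 3), RGB
    cloth_mask: bool array, shape (H, W), True on clothing pixels
    angle_deg:  hue rotation angle in degrees
    """
    # RGB -> YIQ: Y carries luminance (texture), I/Q carry chrominance.
    to_yiq = np.array([[0.299,  0.587,  0.114],
                       [0.596, -0.274, -0.322],
                       [0.211, -0.523,  0.312]])
    to_rgb = np.linalg.inv(to_yiq)

    yiq = img @ to_yiq.T
    # Rotate only the chrominance plane (I, Q); Y is untouched.
    t = np.deg2rad(angle_deg)
    rot = np.array([[np.cos(t), -np.sin(t)],
                    [np.sin(t),  np.cos(t)]])
    shifted = yiq.copy()
    shifted[..., 1:] = yiq[..., 1:] @ rot.T

    out = np.clip(shifted @ to_rgb.T, 0.0, 1.0)
    # Recolor only the clothing region; background and skin stay as-is.
    return np.where(cloth_mask[..., None], out, img)
```

In training, one would draw a random angle per sample so that each identity appears in many clothing colors; because Y is preserved (up to clipping at the [0, 1] range boundary), fabric texture and identity-related detail survive the recoloring.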
URL
https://arxiv.org/abs/2403.08270