Abstract
Online Unsupervised Domain Adaptation (OUDA) for person Re-Identification (Re-ID) is the task of continuously adapting a model trained on a well-annotated source domain dataset to a target domain observed as a data stream. In OUDA, person Re-ID models face two main challenges: catastrophic forgetting and domain shift. In this work, we propose a new Source-guided Similarity Preservation (S2P) framework to alleviate these two problems. Our framework is based on the extraction of a support set composed of the source images that are most similar to the target data. This support set is used to identify feature similarities that must be preserved during the learning process. S2P can incorporate multiple existing UDA methods to mitigate catastrophic forgetting. Our experiments show that S2P outperforms previous state-of-the-art methods on multiple challenging real-to-real and synthetic-to-real OUDA benchmarks.
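The support-set extraction described above can be sketched as a similarity ranking between source and target features. This is a minimal illustrative sketch, not the paper's exact procedure: the feature extractor, the support-set size `k`, and the choice of averaging cosine similarity over the target batch are all assumptions made here for clarity.

```python
# Hypothetical sketch of S2P-style support-set selection: keep the
# source images whose features best match the incoming target stream.
# The aggregation (mean cosine similarity) and k are illustrative
# assumptions, not the method's exact recipe.
import numpy as np


def select_support_set(source_feats: np.ndarray,
                       target_feats: np.ndarray,
                       k: int) -> np.ndarray:
    """Return indices of the k source samples most similar to the target data."""
    # L2-normalize so dot products are cosine similarities.
    s = source_feats / np.linalg.norm(source_feats, axis=1, keepdims=True)
    t = target_feats / np.linalg.norm(target_feats, axis=1, keepdims=True)
    sim = s @ t.T                       # (n_source, n_target) cosine similarities
    score = sim.mean(axis=1)            # average similarity to the target batch
    return np.argsort(score)[::-1][:k]  # indices of the k best-matching sources


# Usage with random stand-ins for 128-D Re-ID embeddings.
rng = np.random.default_rng(0)
src = rng.normal(size=(100, 128))   # source-domain features
tgt = rng.normal(size=(32, 128))    # current target-stream batch
support = select_support_set(src, tgt, k=10)
print(support.shape)  # (10,)
```

In an online setting, this selection would be re-run as new target batches arrive, so the support set tracks the current target distribution while remaining anchored in labeled source data.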
URL
https://arxiv.org/abs/2402.15206