Abstract
Class-incremental learning (CIL) thrives due to its ability to process a continual influx of information, learning from newly added classes while preventing catastrophic forgetting of old ones. Effectively refining past knowledge from the base model and balancing it with new learning is essential for further performance breakthroughs in CIL; however, this issue has not yet been considered in current research. In this work, we explore the potential of CIL from these perspectives and propose a novel balanced residual distillation framework (BRD-CIL) that pushes the performance bar of CIL to a new, higher level. Specifically, BRD-CIL designs a residual distillation learning strategy that dynamically expands the network structure to capture the residuals between the base and target models, effectively refining past knowledge. Furthermore, BRD-CIL designs a balanced pseudo-label learning strategy that generates a guidance mask to reduce the preference for old classes, ensuring balanced learning from new and old classes. We apply the proposed BRD-CIL to the challenging task of 3D point cloud semantic segmentation, where the data are unordered and unstructured. Extensive experimental results demonstrate that BRD-CIL sets a new benchmark with outstanding balance capability in class-biased scenarios.
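The two strategies described above can be sketched at a high level. Everything in the snippet below (the linear heads `W_base` and `W_res`, the toy class counts, and the inverse-frequency weighting used for the guidance mask) is an illustrative assumption, not the paper's actual architecture or loss design:

```python
import numpy as np

rng = np.random.default_rng(0)
NUM_OLD, NUM_NEW = 4, 2           # hypothetical old/new class counts
NUM_POINTS, FEAT = 8, 16          # toy point-cloud batch

# Frozen base model head and a small expanded residual head (both hypothetical).
W_base = rng.normal(size=(FEAT, NUM_OLD))
W_res = rng.normal(size=(FEAT, NUM_OLD + NUM_NEW)) * 0.1

def target_logits(x):
    """Residual distillation idea: target = frozen base prediction
    (zero-padded to the enlarged label space) + residuals from the
    dynamically expanded branch."""
    base = x @ W_base                                   # (N, NUM_OLD)
    base_padded = np.pad(base, ((0, 0), (0, NUM_NEW)))  # (N, NUM_OLD+NUM_NEW)
    return base_padded + x @ W_res

def balanced_pseudo_labels(x, new_label_mask):
    """Balanced pseudo-label idea: label old-class points with the base
    model, then build a guidance mask that down-weights over-represented
    old classes via (assumed) inverse-frequency weighting."""
    pseudo = np.argmax(x @ W_base, axis=1)              # base-model pseudo labels
    counts = np.bincount(pseudo, minlength=NUM_OLD).astype(float)
    weights = counts.sum() / (counts + 1e-6)            # rarer class -> larger weight
    weights /= weights.max()                            # normalize to (0, 1]
    # Ground-truth new-class points keep full weight; pseudo-labeled old-class
    # points are re-weighted to reduce the preference for frequent old classes.
    mask = np.where(new_label_mask, 1.0, weights[pseudo])
    return pseudo, mask

x = rng.normal(size=(NUM_POINTS, FEAT))
logits = target_logits(x)                               # (8, 6) combined logits
pseudo, mask = balanced_pseudo_labels(x, np.zeros(NUM_POINTS, dtype=bool))
```

In this sketch, only `W_res` would be trained during the incremental step, so gradients flow through the residual branch while the base model's old-class knowledge is preserved additively.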
URL
https://arxiv.org/abs/2408.01356