Abstract
The challenge of Class Incremental Learning (CIL) lies in the difficulty for a learner to discern the data of old classes from that of new classes when no previous data is preserved; that is, the representation distributions of different phases overlap with each other. In this paper, to alleviate this representation overlap for both memory-based and memory-free methods, we propose a new CIL framework, Contrastive Class Concentration for CIL (C4IL). Our framework leverages the class concentration effect of contrastive representation learning, yielding a representation distribution with better intra-class compactness and inter-class separability. Quantitative experiments show that our framework is effective in both the memory-based and memory-free cases: it outperforms the baseline methods of both cases by 5% in terms of average and top-1 accuracy in 10-phase and 20-phase CIL. Qualitative results further demonstrate that our method produces a more compact representation distribution that alleviates the overlap problem.
URL
https://arxiv.org/abs/2107.12308
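
Below is a minimal sketch of the kind of class-concentration objective the abstract describes. The abstract does not specify C4IL's exact loss, so this assumes a standard supervised contrastive formulation (SupCon-style), which pulls same-class embeddings together (intra-class compactness) and pushes different-class embeddings apart (inter-class separability); the function name and hyperparameters are illustrative, not the authors' implementation.

```python
# Hypothetical sketch of a class-concentration loss; the paper's exact
# objective is not given in the abstract. Follows the common supervised
# contrastive (SupCon) formulation as an illustrative assumption.
import torch
import torch.nn.functional as F

def class_concentration_loss(features: torch.Tensor,
                             labels: torch.Tensor,
                             temperature: float = 0.1) -> torch.Tensor:
    """features: (N, D) embeddings from the encoder; labels: (N,) class ids."""
    z = F.normalize(features, dim=1)          # unit-norm embeddings
    sim = z @ z.t() / temperature             # pairwise cosine similarities
    n = z.size(0)
    # Exclude each anchor's similarity to itself (large negative, not -inf,
    # so later masked arithmetic stays finite).
    self_mask = torch.eye(n, dtype=torch.bool, device=z.device)
    sim.masked_fill_(self_mask, -1e9)
    # Positives: other samples sharing the anchor's class label.
    pos_mask = (labels.unsqueeze(0) == labels.unsqueeze(1)) & ~self_mask
    # Log-softmax over each anchor's row of similarities.
    log_prob = sim - torch.logsumexp(sim, dim=1, keepdim=True)
    # Average log-likelihood of positives per anchor; anchors with no
    # in-batch positive are skipped.
    pos_counts = pos_mask.sum(dim=1)
    valid = pos_counts > 0
    loss = -(log_prob * pos_mask).sum(dim=1)[valid] / pos_counts[valid]
    return loss.mean()

# Usage sketch: z = encoder(images); loss = class_concentration_loss(z, labels)
```

Minimizing this term concentrates each class around its own cluster in embedding space, which is the effect the abstract credits with reducing overlap between the representation distributions of different incremental phases.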