PASS++: A Dual Bias Reduction Framework for Non-Exemplar Class-Incremental Learning

Journal: IEEE Transactions on Pattern Analysis and Machine Intelligence
Abstract

Class-incremental learning (CIL) aims to continually recognize new classes while preserving the discriminability of previously learned ones. Most existing CIL methods are exemplar-based, relying on the storage and replay of a subset of old data during training. Without access to such data, these methods typically suffer from catastrophic forgetting. In this paper, we identify two fundamental causes of forgetting in CIL: representation bias and classifier bias. To address these challenges, we propose a simple yet effective dual-bias reduction framework, which leverages self-supervised transformation (SST) in the input space and prototype augmentation (protoAug) in the feature space. On the one hand, SST mitigates representation bias by encouraging the model to learn generic, diverse representations that generalize across tasks. On the other hand, protoAug tackles classifier bias by explicitly or implicitly augmenting the prototypes of old classes in the feature space, thereby imposing stronger constraints to preserve decision boundaries. We further enhance the framework with hardness-aware prototype augmentation and multi-view ensemble strategies, yielding significant performance gains. The proposed framework can be easily integrated with pre-trained models. Without storing any samples of old classes, our method performs comparably to state-of-the-art exemplar-based approaches that rely on extensive data storage. We hope to draw the attention of researchers back to non-exemplar CIL by rethinking the necessity of storing old samples.

Authors
Fei Zhu, Xu-Yao Zhang, Zhen Cheng, Cheng-Lin Liu