BFCP: Pursue Better Forward Compatibility Pretraining for Few-Shot Class-Incremental Learning.

Journal: IEEE Transactions on Neural Networks and Learning Systems
Published:
Abstract

Few-shot class-incremental learning (FSCIL) requires learning new knowledge without forgetting old knowledge. Forward compatibility reserves space for novel classes while maintaining base-class knowledge during incremental learning. Better forward compatibility is crucial for effectively mastering all knowledge, especially when dealing with a few unknown new classes. In this article, we propose better forward compatibility pretraining (BFCP) to further enhance forward compatibility in FSCIL. We adopt a two-stage training scheme for the backbone network in the base session. First, we train the backbone network at the image level to enhance its feature extraction capability, enabling the model to extract valuable information from unknown-class images. Second, we fine-tune the backbone network at the feature level with fake prototypes and instances to cluster the base classes and reserve space for unknown new classes. For all new incremental sessions, we freeze the backbone network and employ prototype rectification, without further training, to refine the prototypes of the novel classes. We conduct extensive experiments with different input scales, including federated cross-domain pretraining and cross-domain class-incremental experiments. BFCP efficiently handles both the novel and base classes of each incremental session and significantly outperforms state-of-the-art methods, achieving an average accuracy of 63.47% on the CIFAR100 dataset.
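To make the incremental-session procedure described above concrete, the sketch below illustrates a generic prototype-based classifier of the kind the abstract relies on: the backbone is frozen, each novel class is represented by the mean feature of its few labeled samples, and queries are assigned to the nearest prototype. The function names, the cosine-similarity metric, and the toy dimensions are illustrative assumptions; BFCP's specific prototype rectification step is not reproduced here.

```python
import numpy as np

def class_prototypes(features, labels):
    """Average the frozen-backbone features of each class to form its prototype."""
    return {c: features[labels == c].mean(axis=0) for c in np.unique(labels)}

def nearest_prototype_predict(query_features, prototypes):
    """Assign each query to the class whose prototype has the highest cosine similarity."""
    classes = sorted(prototypes)
    protos = np.stack([prototypes[c] for c in classes])              # (C, D)
    protos = protos / np.linalg.norm(protos, axis=1, keepdims=True)
    queries = query_features / np.linalg.norm(query_features, axis=1, keepdims=True)
    sims = queries @ protos.T                                        # (N, C)
    return np.array(classes)[sims.argmax(axis=1)]

# Toy usage: a 5-way 5-shot novel session with hypothetical 64-dim backbone features.
rng = np.random.default_rng(0)
support_feats = rng.normal(size=(25, 64))
support_labels = np.repeat(np.arange(5), 5)
protos = class_prototypes(support_feats, support_labels)
preds = nearest_prototype_predict(rng.normal(size=(10, 64)), protos)
```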

Authors
Zhiling Fu, Zhe Wang, Xinlei Xu, Wei Guo, Ziqiu Chi, Hai Yang, Wenli Du