
Brain Informatics, Journal Year: 2025, Volume and Issue: 12(1)
Published: March 17, 2025
Abstract Class incremental learning (CIL) is a specific scenario of incremental learning. It aims to continuously learn new classes from a data stream and suffers from the challenge of catastrophic forgetting. Inspired by the human hippocampus, CIL methods that replay episodic memory offer a promising solution. However, the limited buffer budget restricts the number of old-class samples that can be stored, resulting in an imbalance between old- and new-class samples at each incremental stage. This adversely affects the mitigation of catastrophic forgetting. Therefore, we propose a novel CIL method based on a multi-granularity balance strategy (MGBCIL), inspired by three-way granular computing in problem-solving. To mitigate the adverse effects of class imbalance on forgetting at the fine-, medium-, and coarse-grained levels of training, MGBCIL introduces balance strategies across the batch, task, and decision stages. Specifically, a weighted cross-entropy loss function with a smoothing factor is proposed for batch processing. In the processes of task updating and classification decision, contrastive learning with different anchor-point settings is employed to promote local and global separation between old and new classes. Additionally, knowledge distillation is used to preserve knowledge of the old classes. Experimental evaluations on the CIFAR-10 and CIFAR-100 datasets show that MGBCIL outperforms other methods in most incremental settings. In particular, when storing 3 exemplars under the Base2 Inc2 setting, the average accuracy is improved by up to 9.59% and the forgetting rate is reduced by up to 25.45%.
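The abstract does not give the exact formulation of the batch-level loss, so the sketch below is only a minimal PyTorch illustration of the general idea it describes: re-weighting the scarce replayed old-class samples within a batch and applying a smoothing factor to the targets. The function name `weighted_smoothed_ce`, the `class_weights` vector, and the example values are hypothetical, not taken from the paper.

```python
import torch
import torch.nn.functional as F

def weighted_smoothed_ce(logits, targets, class_weights, smoothing=0.1):
    """Cross-entropy with per-class weights and a label-smoothing factor.

    Illustrative only: `class_weights` would typically up-weight the
    under-represented old classes held in the replay buffer, and
    `smoothing` stands in for the smoothing factor mentioned in the abstract.
    """
    num_classes = logits.size(1)
    log_probs = F.log_softmax(logits, dim=1)
    # Smooth the one-hot targets: (1 - s) on the true class, s/(C-1) elsewhere.
    with torch.no_grad():
        smooth_targets = torch.full_like(log_probs, smoothing / (num_classes - 1))
        smooth_targets.scatter_(1, targets.unsqueeze(1), 1.0 - smoothing)
    # Per-sample loss, re-weighted by the weight of each sample's true class.
    per_sample = -(smooth_targets * log_probs).sum(dim=1)
    sample_weights = class_weights[targets]
    return (sample_weights * per_sample).sum() / sample_weights.sum()

# Example: a batch mixing many new-class samples with a few replayed old-class ones.
logits = torch.randn(8, 10)                       # 8 samples, 10 classes seen so far
targets = torch.tensor([0, 1, 1, 9, 9, 9, 9, 9])  # classes 0-1 are old, 9 is new
class_weights = torch.ones(10)
class_weights[:2] = 4.0                           # up-weight the scarce old classes
loss = weighted_smoothed_ce(logits, targets, class_weights)
```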
Language: English