From task-aware to task-agnostic parameter isolation for incremental learning

Alex Vicente-Sola, Paul Kirkland, Gaetano Di Caterina, Trevor J. Bihl, Marc Masana

Research output: Contribution to journal › Article › peer-review

Abstract

Mitigating catastrophic forgetting in continual learning is a long-standing challenge for artificial intelligence. Methods used to alleviate forgetting often rely on rehearsal buffers, pretrained backbones, or task-id knowledge. However, these requirements impose severe limitations on scalability, privacy preservation, and efficient deployment. In this work, we explore how to eliminate such requirements in incremental learning approaches based on parameter isolation. We propose Low Interference Feature Extraction Subnetworks (LIFES), a method that learns a subnetwork per task and uses all of them concurrently at inference time. This solution minimises requirements, but it introduces new challenges. To formalize them, we break down the catastrophic forgetting problem into four distinct causes and address them with a novel lateral classifier regularization, weight standardization, and subnetwork interference connection pruning. In particular, lateral classification shows very promising results, forcing the model to learn distributions with higher inter-class distance. Using these components, LIFES achieves competitive results in standard task-agnostic scenarios, demonstrating the viability of this new perspective on parameter isolation with minimal requirements. Finally, we discuss how future work can further improve this new paradigm, and how the strategies defined here can complement other approaches.
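The core idea of task-agnostic parameter isolation described above — one subnetwork trained per task, with all subnetworks evaluated concurrently at inference so no task-id is required — can be illustrated with a minimal toy sketch. This is not the authors' LIFES implementation; the class name, random stand-in weights, and argmax-over-all-heads prediction rule are illustrative assumptions only.

```python
import numpy as np

rng = np.random.default_rng(0)

class SubnetPerTask:
    """Toy parameter-isolation model: one frozen linear subnetwork per task.

    At inference, every subnetwork runs concurrently and the class with the
    highest logit across all task heads is returned, so no task-id is needed.
    Hypothetical simplification of the paradigm sketched in the abstract.
    """

    def __init__(self, in_dim, classes_per_task):
        self.in_dim = in_dim
        self.k = classes_per_task
        self.subnets = []  # list of (W_feat, W_head) pairs, one per task

    def add_task(self, feat_dim=8):
        # A real method would train these weights on the task's data and
        # then freeze them; random weights stand in for a trained subnetwork.
        W_feat = rng.standard_normal((self.in_dim, feat_dim))
        W_head = rng.standard_normal((feat_dim, self.k))
        self.subnets.append((W_feat, W_head))

    def predict(self, x):
        # Run all subnetworks, concatenate their class logits, and take the
        # argmax over the union of every task's classes (task-agnostic).
        logits = np.concatenate(
            [np.maximum(x @ W_feat, 0.0) @ W_head
             for W_feat, W_head in self.subnets]
        )
        return int(np.argmax(logits))  # global class index across all tasks

model = SubnetPerTask(in_dim=4, classes_per_task=3)
model.add_task()  # task 1: classes 0..2
model.add_task()  # task 2: classes 3..5
pred = model.predict(rng.standard_normal(4))
print(pred)  # a class index in [0, 6)
```

Because each task's parameters are isolated, adding a new task cannot overwrite old weights; the challenges the abstract enumerates (e.g. interference between concurrently active subnetworks, overlapping logit distributions across heads) arise precisely from combining the frozen heads at inference time.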

Original language: English
Article number: 81
Journal: Neural Processing Letters
Volume: 57
Issue number: 5
DOIs
Publication status: Published - Oct 2025
Externally published: Yes

Bibliographical note

Publisher Copyright:
© The Author(s) 2025.

Keywords

  • Catastrophic Forgetting
  • Continual Learning
  • Representational Overlap
  • Task-agnostic
