Poster Session 2 · Wednesday, December 3, 2025 4:30 PM → 7:30 PM
#4019

Self-Evolving Pseudo-Rehearsal for Catastrophic Forgetting with Task Similarity in LLMs


Abstract

Continual learning for large language models (LLMs) demands a precise balance between plasticity (the ability to absorb new tasks) and stability (the preservation of previously learned knowledge). Conventional rehearsal methods, which replay stored examples, break down when past data becomes inaccessible over the long term; earlier pseudo-rehearsal methods require additional generation modules, while self-synthesis approaches often generate samples that align poorly with real tasks, suffer from unstable outputs, and ignore task relationships.
We present Self-Evolving Pseudo-Rehearsal for Catastrophic Forgetting with Task Similarity (SERS), a lightweight framework that
  1. decouples pseudo-input synthesis from label creation, using semantic masking and template guidance to produce diverse, task-relevant prompts without extra modules;
  2. applies label self-evolution, blending base-model priors with fine-tuned outputs to prevent over-specialization; and
  3. introduces a dynamic regularizer driven by the Wasserstein distance between task distributions, automatically relaxing or strengthening constraints in proportion to task similarity.
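Steps 2 and 3 above can be sketched in miniature. This is an illustrative toy, not the paper's implementation: probability vectors stand in for model outputs, 1-D feature samples stand in for task distributions, and the names (`evolve_labels`, `dynamic_reg_strength`) and the exponential mapping from distance to regularization weight are assumptions for exposition.

```python
import math

def evolve_labels(base_probs, finetuned_probs, alpha=0.5):
    """Step 2 (label self-evolution, sketched): blend base-model priors
    with fine-tuned outputs so pseudo-labels do not over-specialize."""
    return [alpha * b + (1 - alpha) * f
            for b, f in zip(base_probs, finetuned_probs)]

def wasserstein_1d(u, v):
    """1-D Wasserstein-1 distance; for equal-size samples it reduces to
    the mean absolute difference between the sorted samples."""
    assert len(u) == len(v)
    return sum(abs(a - b) for a, b in zip(sorted(u), sorted(v))) / len(u)

def dynamic_reg_strength(old_feats, new_feats, lambda_max=1.0, scale=1.0):
    """Step 3 (dynamic regularizer, sketched): map the distance between
    task distributions to a constraint weight in [0, lambda_max), so
    similar tasks (small distance) relax the constraint and dissimilar
    tasks tighten it. The exponential squashing is an assumption."""
    d = wasserstein_1d(old_feats, new_feats)
    return lambda_max * (1.0 - math.exp(-scale * d))
```

Under this mapping, a new task whose feature distribution sits close to an old task's yields a small regularization weight (knowledge is free to transfer), while a distant task drives the weight toward `lambda_max` to protect prior knowledge.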
Experiments across diverse tasks on different LLMs show that SERS reduces forgetting by over 2 percentage points against strong pseudo-rehearsal baselines, by using data efficiently and transferring knowledge selectively across related tasks. The code will be released at https://github.com/JerryWangJun/LLMCLSERS/.