Schedule-Robust Continual Learning
Author(s) -
Ruohan Wang,
Marco Ciccone,
Massimiliano Pontil,
Carlo Ciliberto
Publication year - 2025
Publication title -
IEEE Transactions on Pattern Analysis and Machine Intelligence
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 3.811
H-Index - 372
eISSN - 1939-3539
pISSN - 0162-8828
DOI - 10.1109/TPAMI.2025.3614868
Subject(s) - computing and processing , bioengineering
Continual learning (CL) tackles a fundamental challenge in machine learning: continuously learning from non-stationary data streams while mitigating forgetting of previously learned data. Although existing CL algorithms have introduced various practical techniques for combating forgetting, little attention has been devoted to studying how data schedules – which dictate how the sample distribution of a data stream evolves over time – affect the CL problem. Empirically, most CL methods are susceptible to schedule changes: they exhibit markedly lower accuracy when dealing with more "difficult" schedules over the same underlying training data. In practical scenarios, data schedules are often unknown, and a key challenge is thus to design CL methods that are robust to diverse schedules to ensure model reliability. In this work, we introduce the novel concept of schedule robustness for CL and propose Schedule-Robust Continual Learning (SCROLL), a strong baseline satisfying this desirable property. SCROLL trains a linear classifier on a suitably pre-trained representation, followed by model adaptation using replay data only. We connect SCROLL to a meta-learning formulation of CL with provable guarantees on schedule robustness. Empirically, the proposed method significantly outperforms existing CL methods, and we provide extensive ablations to highlight its properties.