Continual learning via probabilistic exchangeable sequence modelling
Published as a preprint on arXiv, 2025
Continual learning (CL) refers to the ability to continuously learn and accumulate new knowledge while retaining useful information from past experiences. Although numerous CL methods have been proposed in recent years, it is not straightforward to deploy them directly to real-world decision-making problems due to their computational cost and lack of uncertainty quantification. To address these issues, we propose CL-BRUNO, a probabilistic, Neural Process-based CL model that performs scalable and tractable Bayesian update and prediction. Our proposed approach uses deep generative models to create a unified probabilistic framework capable of handling different types of CL problems such as task- and class-incremental learning, allowing users to integrate information across different CL scenarios using a single model. Our approach is able to prevent catastrophic forgetting through distributional and functional regularisation without the need to retain any previously seen samples, making it appealing to applications where data privacy or storage capacity is of concern. Experiments show that CL-BRUNO outperforms existing methods on both natural image and biomedical data sets, confirming its effectiveness in real-world applications.
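To give a flavour of the Neural Process-style conditioning the abstract refers to, here is a minimal sketch of a generic Conditional Neural Process. This is not the CL-BRUNO architecture itself; it only illustrates how a permutation-invariant summary of an exchangeable context set conditions a predictive distribution, so that new observations can be absorbed without storing or revisiting past samples. All class names, layer sizes, and dimensions below are illustrative assumptions.

```python
# Minimal sketch (not the paper's model): Neural Process-style prediction
# conditioned on an exchangeable context set.
import torch
import torch.nn as nn

class ConditionalNeuralProcess(nn.Module):
    def __init__(self, x_dim=1, y_dim=1, r_dim=64):
        super().__init__()
        # Encoder maps each (x, y) context pair to a representation r_i.
        self.encoder = nn.Sequential(
            nn.Linear(x_dim + y_dim, 64), nn.ReLU(), nn.Linear(64, r_dim))
        # Decoder maps (aggregated r, target x) to predictive mean and log-scale.
        self.decoder = nn.Sequential(
            nn.Linear(r_dim + x_dim, 64), nn.ReLU(), nn.Linear(64, 2 * y_dim))

    def forward(self, x_ctx, y_ctx, x_tgt):
        # Mean-pooling over context points makes the summary exchangeable:
        # any permutation of the context yields the same representation.
        r = self.encoder(torch.cat([x_ctx, y_ctx], dim=-1)).mean(dim=0)
        r = r.expand(x_tgt.shape[0], -1)
        out = self.decoder(torch.cat([r, x_tgt], dim=-1))
        mu, log_sigma = out.chunk(2, dim=-1)
        # Return a full predictive distribution, i.e. calibrated uncertainty
        # rather than a point estimate.
        return torch.distributions.Normal(mu, torch.exp(log_sigma))

# Usage: condition on a small context set, then query new target inputs.
model = ConditionalNeuralProcess()
x_ctx, y_ctx = torch.randn(10, 1), torch.randn(10, 1)
pred = model(x_ctx, y_ctx, torch.randn(5, 1))
print(pred.mean.shape, pred.stddev.shape)  # torch.Size([5, 1]) each
```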
Download here