INTEGRATING PRESENT AND PAST IN UNSUPERVISED CONTINUAL LEARNING

Yipeng Zhang, Laurent Charlin, Richard Zemel, Mengye Ren

Research output: Contribution to journal › Conference article › peer-review

Abstract

We formulate a unifying framework for unsupervised continual learning (UCL), which disentangles learning objectives that are specific to the present and the past data, encompassing stability, plasticity, and cross-task consolidation. The framework reveals that many existing UCL approaches overlook cross-task consolidation and try to balance plasticity and stability in a shared embedding space. This results in worse performance due to a lack of within-task data diversity and reduced effectiveness in learning the current task. Our method, Osiris, which explicitly optimizes all three objectives on separate embedding spaces, achieves state-of-the-art performance on all benchmarks, including two novel ones proposed in this paper featuring semantically structured task sequences. Finally, we show preliminary evidence that continual models can benefit from these more realistic learning scenarios.
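For intuition only, the sketch below illustrates the three-objective decomposition the abstract describes: a shared encoder with one projection head per objective, trained with an InfoNCE-style loss on current-task views (plasticity), replayed memory views (stability), and a current-versus-memory contrast (cross-task consolidation). The class and head names, the choice of InfoNCE, and the replay buffer are assumptions for illustration, not the paper's actual implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


def info_nce(anchor, positive, extra_negatives=None, temperature=0.1):
    """InfoNCE loss: each anchor's positive is the matching row of `positive`;
    other rows in the batch (and optional `extra_negatives`) act as negatives."""
    anchor = F.normalize(anchor, dim=1)
    positive = F.normalize(positive, dim=1)
    logits = anchor @ positive.t() / temperature          # (B, B)
    if extra_negatives is not None:
        extra = F.normalize(extra_negatives, dim=1)
        logits = torch.cat([logits, anchor @ extra.t() / temperature], dim=1)
    labels = torch.arange(anchor.size(0), device=anchor.device)
    return F.cross_entropy(logits, labels)


class ThreeSpaceUCL(nn.Module):
    """Hypothetical module: a shared encoder with three projection heads,
    one per objective (plasticity, stability, cross-task consolidation)."""

    def __init__(self, encoder, feat_dim=512, proj_dim=128):
        super().__init__()
        self.encoder = encoder
        self.head_plasticity = nn.Linear(feat_dim, proj_dim)     # current-task space
        self.head_stability = nn.Linear(feat_dim, proj_dim)      # past-task space
        self.head_consolidation = nn.Linear(feat_dim, proj_dim)  # cross-task space

    def forward(self, cur_v1, cur_v2, mem_v1, mem_v2):
        # Two augmented views of the current-task batch and of a replayed memory batch.
        h_c1, h_c2 = self.encoder(cur_v1), self.encoder(cur_v2)
        h_m1, h_m2 = self.encoder(mem_v1), self.encoder(mem_v2)

        # Plasticity: self-supervised learning on current-task data in its own space.
        loss_plas = info_nce(self.head_plasticity(h_c1), self.head_plasticity(h_c2))

        # Stability: preserve the structure of past (memory) data in a separate space.
        loss_stab = info_nce(self.head_stability(h_m1), self.head_stability(h_m2))

        # Cross-task consolidation: contrast current views against each other while
        # using memory embeddings as extra negatives, relating present and past tasks.
        loss_cons = info_nce(self.head_consolidation(h_c1),
                             self.head_consolidation(h_c2),
                             extra_negatives=self.head_consolidation(h_m1))

        return loss_plas + loss_stab + loss_cons
```

In practice the three terms would typically be weighted and the memory batch drawn from a small replay buffer; refer to the paper for the objectives Osiris actually optimizes.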

Original language: English (US)
Pages (from-to): 388-409
Number of pages: 22
Journal: Proceedings of Machine Learning Research
Volume: 274
State: Published - 2024
Event: 3rd Conference on Lifelong Learning Agents, CoLLAs 2024 - Pisa, Italy
Duration: Jul 29, 2024 – Aug 1, 2024

ASJC Scopus subject areas

  • Artificial Intelligence
  • Software
  • Control and Systems Engineering
  • Statistics and Probability

