Continual Neural Topic Model

Charu Karakkaparambil James, Waleed Mustafa, Marcio Monteiro, Marius Kloft, Sophie Fellenz


Abstract
In continual learning, the aim is to learn a new task without forgetting what was learned previously. For topic models, this translates to learning new topics without forgetting previously learned ones. Previous work considered either Dynamic Topic Models (DTMs), which learn the evolution of topics from the entire training corpus at once, or Online Topic Models, which are updated continuously from new data but lack long-term memory. To fill this gap, we propose the Continual Neural Topic Model (CoNTM), which continuously learns topic models at subsequent time steps without forgetting what was previously learned. This is achieved through a global prior distribution that is continuously updated. In our experiments, CoNTM consistently outperformed the dynamic topic model in topic quality and predictive perplexity while capturing topic changes online. Our analysis reveals that CoNTM learns more diverse topics and better captures temporal changes than existing methods.
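The core mechanism described above, a global prior distribution that is continuously updated across time steps, can be illustrated with a minimal sketch. This is not the paper's implementation: the Gaussian parameterization, the exponential-decay update, and the names GlobalPrior and kl_to_prior are assumptions made purely for illustration of how a prior could carry topic knowledge from one time step to the next.

    import numpy as np

    class GlobalPrior:
        """Illustrative global Gaussian prior over document-topic logits."""

        def __init__(self, n_topics, decay=0.9):
            self.mu = np.zeros(n_topics)   # prior mean over topic logits
            self.var = np.ones(n_topics)   # prior variance
            self.decay = decay             # how strongly earlier knowledge is retained

        def update(self, post_mu, post_var):
            """Blend the aggregated posterior of the current time step into the prior."""
            self.mu = self.decay * self.mu + (1 - self.decay) * post_mu.mean(axis=0)
            self.var = self.decay * self.var + (1 - self.decay) * post_var.mean(axis=0)

    def kl_to_prior(post_mu, post_var, prior):
        """KL(q || p) term regularizing the current time step toward the global prior,
        so that topics learned earlier are not forgotten."""
        return 0.5 * np.sum(
            np.log(prior.var) - np.log(post_var)
            + (post_var + (post_mu - prior.mu) ** 2) / prior.var
            - 1.0,
            axis=1,
        )

In a variational neural topic model, such a KL term would replace the usual standard-normal KL in the objective for each new time step; the sketch only conveys the idea of a continuously updated prior, not the specific update rule used by CoNTM.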
Anthology ID: 2026.eacl-long.312
Volume: Proceedings of the 19th Conference of the European Chapter of the Association for Computational Linguistics (Volume 1: Long Papers)
Month: March
Year: 2026
Address: Rabat, Morocco
Editors: Vera Demberg, Kentaro Inui, Lluís Marquez
Venue: EACL
Publisher: Association for Computational Linguistics
Pages: 6636–6658
URL: https://preview.aclanthology.org/ingest-eacl/2026.eacl-long.312/
Cite (ACL): Charu Karakkaparambil James, Waleed Mustafa, Marcio Monteiro, Marius Kloft, and Sophie Fellenz. 2026. Continual Neural Topic Model. In Proceedings of the 19th Conference of the European Chapter of the Association for Computational Linguistics (Volume 1: Long Papers), pages 6636–6658, Rabat, Morocco. Association for Computational Linguistics.
Cite (Informal): Continual Neural Topic Model (Karakkaparambil James et al., EACL 2026)
PDF: https://preview.aclanthology.org/ingest-eacl/2026.eacl-long.312.pdf