@inproceedings{srijith-etal-2025-continual,
title = "Continual Learning in Large Language Models: Foundations to Frontiers",
author = "Srijith, P. K. and
Satapara, Shrey and
Chandar, Sarath",
editor = "Heinzerling, Benjamin and
Ku, Lun-Wei",
booktitle = "Proceedings of the 14th International Joint Conference on Natural Language Processing and the 4th Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics: Tutorial Abstract",
month = dec,
year = "2025",
address = "Mumbai, India",
publisher = "Association for Computational Linguistics",
url = "https://preview.aclanthology.org/ingest-ijcnlp-aacl/2025.ijcnlp-tutorials.2/",
pages = "6--17",
ISBN = "979-8-89176-302-9",
    abstract = "Continual learning (CL) enables deep learning models to learn a sequence of tasks in resource-constrained settings without forgetting previously acquired knowledge. This is particularly useful for multilingual NLP on low-resource languages, where data is often collected incrementally and compute cost is a critical concern. This tutorial introduces key CL methodologies and their applications in natural language processing (NLP), covering both foundational techniques and the modern challenges posed by large language models (LLMs). It covers foundational CL strategies based on regularization, replay, and network architecture. We explore NLP-specific CL scenarios, such as task-incremental, language-incremental, and joint task-language incremental setups, along with methodologies to address them. A major emphasis of the tutorial is continual learning for LLMs, examining the challenges of applying CL to LLMs and the benefits it can provide for LLM training and inference. We further explore the connection between continual learning and several recent advances in LLMs, such as model merging. This tutorial is suitable for NLP researchers, practitioners, and students interested in lifelong learning, multilingual NLP, or large language models. It is designed as a half-day tutorial at IJCNLP 2025 and falls under the category of Introduction to Non-CL/Non-NLP Topic."
}

Markdown (Informal)
[Continual Learning in Large Language Models: Foundations to Frontiers](https://preview.aclanthology.org/ingest-ijcnlp-aacl/2025.ijcnlp-tutorials.2/) (Srijith et al., IJCNLP 2025)
ACL
P. K. Srijith, Shrey Satapara, and Sarath Chandar. 2025. Continual Learning in Large Language Models: Foundations to Frontiers. In Proceedings of the 14th International Joint Conference on Natural Language Processing and the 4th Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics: Tutorial Abstract, pages 6–17, Mumbai, India. Association for Computational Linguistics.