L4: Mutual Learning Helps Lifelong Language Learning

Jiyong Li, Dilshod Azizov, Shangsong Liang


Abstract
Adapting language models to learn continuously from data streams while retaining previous knowledge is a key challenge in artificial intelligence (AI), particularly in lifelong language learning. Existing distillation methods rely on offline techniques, which limits their ability to update in real time and adapt to dynamic environments. To address this, we propose online dynamic mutual distillation, a novel framework that enables continuous mutual learning from task streams without relying on domain-specific teachers. To our knowledge, this is the first application of mutual learning in lifelong language learning, providing dynamic knowledge transfer without domain-specific teachers. Moreover, our extensive experiments demonstrate that the proposed method reduces catastrophic forgetting while improving task performance on various benchmark datasets, making it suitable for real-world, dynamic natural language processing (NLP) applications such as adaptive chatbots and personalized language systems. We will make our code publicly available upon acceptance.
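For readers unfamiliar with mutual learning, the sketch below illustrates the generic two-peer mutual distillation setup (in the style of deep mutual learning): each peer is trained on its task loss plus a KL term toward the other peer's softened predictions, so neither model acts as a fixed teacher. This is only an illustrative sketch of the general idea, not the paper's online dynamic variant; `model_a`, `model_b`, `batch`, and the hyperparameters are hypothetical placeholders.

```python
import torch
import torch.nn.functional as F


def mutual_learning_step(model_a, model_b, opt_a, opt_b, batch,
                         temperature=2.0, alpha=0.5):
    """One step of generic two-peer mutual learning (illustrative only).

    Each peer minimizes its own cross-entropy plus a KL term pulling its
    predictions toward the other peer's detached soft predictions, so
    knowledge flows in both directions without a fixed teacher.
    """
    inputs, labels = batch

    logits_a = model_a(inputs)
    logits_b = model_b(inputs)

    # Soft targets from each peer, detached so gradients do not cross peers.
    soft_a = F.softmax(logits_a.detach() / temperature, dim=-1)
    soft_b = F.softmax(logits_b.detach() / temperature, dim=-1)

    loss_a = (1 - alpha) * F.cross_entropy(logits_a, labels) \
        + alpha * (temperature ** 2) * F.kl_div(
            F.log_softmax(logits_a / temperature, dim=-1),
            soft_b, reduction="batchmean")
    loss_b = (1 - alpha) * F.cross_entropy(logits_b, labels) \
        + alpha * (temperature ** 2) * F.kl_div(
            F.log_softmax(logits_b / temperature, dim=-1),
            soft_a, reduction="batchmean")

    opt_a.zero_grad(); loss_a.backward(); opt_a.step()
    opt_b.zero_grad(); loss_b.backward(); opt_b.step()
    return loss_a.item(), loss_b.item()
```

In a lifelong-learning setting, such a step would be applied online as new tasks arrive in the stream, rather than with a pre-trained, domain-specific teacher fixed in advance.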
Anthology ID:
2025.emnlp-industry.89
Volume:
Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing: Industry Track
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Saloni Potdar, Lina Rojas-Barahona, Sebastien Montella
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
1275–1286
URL:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-industry.89/
Cite (ACL):
Jiyong Li, Dilshod Azizov, and Shangsong Liang. 2025. L4: Mutual Learning Helps Lifelong Language Learning. In Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing: Industry Track, pages 1275–1286, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
L4: Mutual Learning Helps Lifelong Language Learning (Li et al., EMNLP 2025)
PDF:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-industry.89.pdf