What Causes Knowledge Loss in Multilingual Language Models?

Maria Khelli, Samuel Cahyawijaya, Ayu Purwarianti, Genta Indra Winata


Abstract
Cross-lingual transfer in natural language processing (NLP) models enhances multilingual performance by leveraging shared linguistic knowledge. However, traditional training methods that process all data simultaneously fail to mimic real-world scenarios in which data arrive sequentially, leading to catastrophic forgetting, where fine-tuning on new tasks degrades performance on previously learned ones. Our study explores this issue in multilingual settings, focusing on how linguistic differences affect representation learning rather than on model parameters alone. We experiment with 52 languages using LoRA adapters of varying ranks to evaluate non-shared, partially shared, and fully shared parameters, asking whether parameter sharing through adapters can mitigate forgetting while preserving prior knowledge. We find that languages written in non-Latin scripts are more susceptible to catastrophic forgetting, whereas Latin-script languages facilitate more effective cross-lingual transfer.
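As an illustrative sketch of the setup the abstract describes, LoRA adapters of varying ranks can be attached to a frozen multilingual base model, with the rank controlling how much language-specific (non-shared) capacity each adapter has. The paper does not specify an implementation; the HuggingFace PEFT API, base checkpoint, rank values, and target module names below are assumptions for illustration only.

from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

# Illustrative ranks: a low rank forces heavier reliance on the shared
# frozen base; a high rank permits more language-specific capacity.
for rank in (4, 16, 64):
    # Hypothetical base model; the paper's actual checkpoint may differ.
    base = AutoModelForCausalLM.from_pretrained("bigscience/bloom-560m")
    config = LoraConfig(
        r=rank,
        lora_alpha=2 * rank,
        target_modules=["query_key_value"],  # attention projections in BLOOM
        lora_dropout=0.05,
        task_type="CAUSAL_LM",
    )
    # Wrap the frozen base with trainable low-rank adapter matrices.
    model = get_peft_model(base, config)
    model.print_trainable_parameters()

Training one such adapter per language (or per language group) then makes it possible to compare forgetting under non-shared, partially shared, and fully shared parameter regimes.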
Anthology ID:
2025.fieldmatters-1.2
Volume:
Proceedings of the Fourth Workshop on NLP Applications to Field Linguistics
Month:
August
Year:
2025
Address:
Vienna, Austria
Editors:
Éric Le Ferrand, Elena Klyachko, Anna Postnikova, Tatiana Shavrina, Oleg Serikov, Ekaterina Voloshina, Ekaterina Vylomova
Venues:
FieldMatters | WS
Publisher:
Association for Computational Linguistics
Pages:
15–25
URL:
https://preview.aclanthology.org/corrections-2025-08/2025.fieldmatters-1.2/
Cite (ACL):
Maria Khelli, Samuel Cahyawijaya, Ayu Purwarianti, and Genta Indra Winata. 2025. What Causes Knowledge Loss in Multilingual Language Models? In Proceedings of the Fourth Workshop on NLP Applications to Field Linguistics, pages 15–25, Vienna, Austria. Association for Computational Linguistics.
Cite (Informal):
What Causes Knowledge Loss in Multilingual Language Models? (Khelli et al., FieldMatters 2025)
PDF:
https://preview.aclanthology.org/corrections-2025-08/2025.fieldmatters-1.2.pdf