Mixture of Languages: Improved Multilingual Encoders Through Language Grouping

João Maria Janeiro, Belen Alastruey, Francisco Massa, Maha Elbayad, Benjamin Piwowarski, Patrick Gallinari, Loic Barrault


Abstract
We propose Mixture of Languages (MoL), a new strategy for pretraining massively multilingual encoders. Recent work in this area has relied on training transformer encoders on large amounts of multilingual data, with all parameters shared across all languages, without studying how to optimally balance language transfer and interference to achieve better performance. To address this, MoL groups languages based on their similarity and adds parallel, sparsely activated layers that process each group independently. This architecture allows MoL to boost language transfer while minimizing interference, without increasing the active parameter count. We show that MoL largely outperforms a dense counterpart trained with the same configuration, as well as Mixture of Experts (MoE) models and public multilingual encoders such as XLM-R or mBERT on downstream tasks.
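The abstract describes the architecture only at a high level. Below is a minimal PyTorch sketch of the core idea as stated there: parallel, sparsely activated layers, one per language group, where only the group matching the input's language is computed, so the active parameter count stays that of a single dense layer. The class name LanguageGroupLayer, the example LANG2GROUP table, and routing by a sequence-level language ID are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn


class LanguageGroupLayer(nn.Module):
    """Sketch of a sparsely activated per-group feed-forward layer.

    Each language group owns its own FFN; a sequence is processed only by
    the FFN of its group, so the number of *active* parameters per input
    matches a single dense FFN.
    """

    def __init__(self, d_model: int, d_ff: int, num_groups: int):
        super().__init__()
        self.group_ffns = nn.ModuleList(
            nn.Sequential(
                nn.Linear(d_model, d_ff),
                nn.GELU(),
                nn.Linear(d_ff, d_model),
            )
            for _ in range(num_groups)
        )

    def forward(self, hidden: torch.Tensor, group_ids: torch.Tensor) -> torch.Tensor:
        # hidden: (batch, seq, d_model); group_ids: (batch,) group index per sequence
        out = torch.zeros_like(hidden)
        for g, ffn in enumerate(self.group_ffns):
            mask = group_ids == g
            if mask.any():
                # Only sequences belonging to group g activate this FFN.
                out[mask] = ffn(hidden[mask])
        return out


# Hypothetical similarity-based grouping of languages (illustrative only).
LANG2GROUP = {"en": 0, "de": 0, "fr": 1, "es": 1, "zh": 2, "ja": 2}

layer = LanguageGroupLayer(d_model=768, d_ff=3072, num_groups=3)
x = torch.randn(2, 16, 768)  # two sequences of 16 tokens
groups = torch.tensor([LANG2GROUP["en"], LANG2GROUP["fr"]])
y = layer(x, groups)  # each sequence is routed to its own group's FFN
```

Unlike a learned-router MoE, routing here is deterministic (by language group), which is one plausible reading of how interference between dissimilar languages is avoided without extra active compute.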
Anthology ID:
2025.emnlp-main.1509
Volume:
Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
29695–29710
URL:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.1509/
Cite (ACL):
João Maria Janeiro, Belen Alastruey, Francisco Massa, Maha Elbayad, Benjamin Piwowarski, Patrick Gallinari, and Loic Barrault. 2025. Mixture of Languages: Improved Multilingual Encoders Through Language Grouping. In Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing, pages 29695–29710, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
Mixture of Languages: Improved Multilingual Encoders Through Language Grouping (Janeiro et al., EMNLP 2025)
PDF:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.1509.pdf
Checklist:
 2025.emnlp-main.1509.checklist.pdf