On Multilingual Encoder Language Model Compression for Low-Resource Languages
Daniil Gurgurov, Michal Gregor, Josef Van Genabith, Simon Ostermann
Abstract
In this paper, we combine two-step knowledge distillation, structured pruning, truncation, and vocabulary trimming for extreme compression of multilingual encoder-only language models for low-resource languages. Our novel approach systematically combines existing techniques and takes them to the extreme, reducing layer depth, feed-forward hidden size, and intermediate layer embedding size to create significantly smaller monolingual models while retaining essential language-specific knowledge. We achieve compression rates of up to 92% while maintaining competitive performance, with average drops of 2–10% for moderate compression and 8–13% at maximum compression on four downstream tasks (sentiment analysis, topic classification, named entity recognition, and part-of-speech tagging) across three low-resource languages. Notably, the performance degradation correlates with the amount of language-specific data in the teacher model, with larger datasets resulting in smaller performance losses. Additionally, we conduct ablation studies to identify the best practices for multilingual model compression using these techniques.
- Anthology ID:
- 2025.ijcnlp-srw.5
- Volume:
- The 14th International Joint Conference on Natural Language Processing and The 4th Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics
- Month:
- December
- Year:
- 2025
- Address:
- Mumbai, India
- Editors:
- Santosh T.y.s.s, Shuichiro Shimizu, Yifan Gong
- Venue:
- IJCNLP
- Publisher:
- Association for Computational Linguistics
- Pages:
- 47–58
- URL:
- https://preview.aclanthology.org/ingest-ijcnlp-aacl/2025.ijcnlp-srw.5/
- Cite (ACL):
- Daniil Gurgurov, Michal Gregor, Josef Van Genabith, and Simon Ostermann. 2025. On Multilingual Encoder Language Model Compression for Low-Resource Languages. In The 14th International Joint Conference on Natural Language Processing and The 4th Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics, pages 47–58, Mumbai, India. Association for Computational Linguistics.
- Cite (Informal):
- On Multilingual Encoder Language Model Compression for Low-Resource Languages (Gurgurov et al., IJCNLP 2025)
- PDF:
- https://preview.aclanthology.org/ingest-ijcnlp-aacl/2025.ijcnlp-srw.5.pdf
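The abstract above names vocabulary trimming as one of the compression steps. Below is a minimal, hypothetical sketch of how such trimming could look for a multilingual encoder using the Hugging Face `transformers` API; the model name, corpus path, and procedure are illustrative assumptions, not the authors' released implementation.

```python
# Hypothetical vocabulary-trimming sketch for a low-resource language.
# Assumptions: an XLM-R-style checkpoint and a small monolingual text corpus.
import torch
from transformers import AutoTokenizer, AutoModel

MODEL_NAME = "xlm-roberta-base"          # assumed multilingual base encoder
CORPUS_PATH = "monolingual_corpus.txt"   # hypothetical target-language corpus

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModel.from_pretrained(MODEL_NAME)

# 1) Collect the subword IDs actually used by the target language,
#    always keeping the tokenizer's special tokens.
used_ids = set(tokenizer.all_special_ids)
with open(CORPUS_PATH, encoding="utf-8") as f:
    for line in f:
        used_ids.update(tokenizer(line.strip())["input_ids"])

# 2) Slice the input embedding matrix down to the kept rows only.
keep = sorted(used_ids)
old_emb = model.get_input_embeddings().weight.data
new_emb = torch.nn.Embedding(len(keep), old_emb.size(1))
new_emb.weight.data = old_emb[keep].clone()
model.set_input_embeddings(new_emb)
model.config.vocab_size = len(keep)

# Note: a complete pipeline must also remap the tokenizer so its IDs match
# the new, contiguous embedding rows; that step is omitted in this sketch.
print(f"Embedding rows reduced from {old_emb.size(0)} to {len(keep)}")
```

In the paper's setting, this trimming would be combined with the other steps described in the abstract (layer truncation, structured pruning of feed-forward and embedding dimensions, and two-step knowledge distillation), which are not shown here.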