Multilingual Machine Translation with Open Large Language Models at Practical Scale: An Empirical Study

Menglong Cui, Pengzhi Gao, Wei Liu, Jian Luan, Bin Wang


Abstract
Large language models (LLMs) have shown continuously improving multilingual capabilities, and even small-scale open-source models have demonstrated rapid performance improvements. In this paper, we systematically explore the abilities of open LLMs with fewer than ten billion parameters to handle multilingual machine translation (MT) tasks. We conduct comprehensive evaluations on six popular LLMs and find that models like Gemma2-9B exhibit impressive multilingual translation capabilities. We then introduce the Parallel-First Monolingual-Second (PFMS) data mixing strategy in the continual pretraining stage to further enhance MT performance, and present GemmaX2-28, a 9B model achieving top-tier multilingual translation performance across 28 languages. Specifically, GemmaX2-28 consistently outperforms state-of-the-art (SOTA) models such as TowerInstruct and X-ALMA, and achieves performance competitive with Google Translate and GPT-4-turbo.
Anthology ID:
2025.naacl-long.280
Volume:
Proceedings of the 2025 Conference of the Nations of the Americas Chapter of the Association for Computational Linguistics: Human Language Technologies (Volume 1: Long Papers)
Month:
April
Year:
2025
Address:
Albuquerque, New Mexico
Editors:
Luis Chiruzzo, Alan Ritter, Lu Wang
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
5420–5443
URL:
https://preview.aclanthology.org/moar-dois/2025.naacl-long.280/
DOI:
10.18653/v1/2025.naacl-long.280
Cite (ACL):
Menglong Cui, Pengzhi Gao, Wei Liu, Jian Luan, and Bin Wang. 2025. Multilingual Machine Translation with Open Large Language Models at Practical Scale: An Empirical Study. In Proceedings of the 2025 Conference of the Nations of the Americas Chapter of the Association for Computational Linguistics: Human Language Technologies (Volume 1: Long Papers), pages 5420–5443, Albuquerque, New Mexico. Association for Computational Linguistics.
Cite (Informal):
Multilingual Machine Translation with Open Large Language Models at Practical Scale: An Empirical Study (Cui et al., NAACL 2025)
PDF:
https://preview.aclanthology.org/moar-dois/2025.naacl-long.280.pdf