DLUT and GTCOM’s Large Language Model Based Translation System for WMT25

Hao Zong, Chao Bei, Wentao Chen, Conghu Yuan, Huan Liu, Degen Huang


Abstract
This paper presents the submission from Dalian University of Technology (DLUT) and Global Tone Communication Technology Co., Ltd. (GTCOM) to the WMT25 General Machine Translation Task. Amidst the paradigm shift from specialized encoder-decoder models to general-purpose Large Language Models (LLMs), this work conducts a systematic comparison of both approaches across five language pairs. For traditional Neural Machine Translation (NMT), we build strong baselines using deep Transformer architectures enhanced with data augmentation. For the LLM paradigm, we explore zero-shot performance and two distinct supervised fine-tuning (SFT) strategies: direct translation and translation refinement. Our key findings reveal a significant discrepancy between lexical and semantic evaluation metrics: while strong NMT systems remain competitive in BLEU scores, fine-tuned LLMs demonstrate marked superiority in semantic fidelity as measured by COMET. Furthermore, we find that fine-tuning LLMs for direct translation is more effective than for refinement, suggesting that teaching the core task directly is preferable to correcting baseline outputs.
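The abstract contrasts two supervised fine-tuning strategies: teaching the LLM to translate the source directly versus teaching it to refine an existing NMT hypothesis. As a minimal sketch, the snippet below shows how SFT examples for the two strategies might be laid out; the prompt wording and field names are illustrative assumptions, not the authors' actual templates.

```python
# Hypothetical sketch of the two SFT data formats contrasted in the abstract:
# (1) direct translation and (2) refinement of an existing NMT draft.
# Prompt wording and field names are illustrative assumptions only.

def build_direct_example(src: str, ref: str, src_lang: str, tgt_lang: str) -> dict:
    """SFT pair that teaches the LLM to translate the source directly."""
    return {
        "instruction": f"Translate the following {src_lang} text into {tgt_lang}.",
        "input": src,
        "output": ref,
    }

def build_refinement_example(src: str, draft: str, ref: str,
                             src_lang: str, tgt_lang: str) -> dict:
    """SFT pair that teaches the LLM to post-edit an NMT draft translation."""
    return {
        "instruction": (
            f"Improve the {tgt_lang} translation of the {src_lang} source. "
            "Correct any errors while preserving the meaning."
        ),
        "input": f"Source: {src}\nDraft translation: {draft}",
        "output": ref,
    }

if __name__ == "__main__":
    ex = build_direct_example("机器翻译", "machine translation", "Chinese", "English")
    print(ex["instruction"], "->", ex["output"])
```

Under this framing, the abstract's finding is that models fine-tuned on the first format outperform those fine-tuned on the second, i.e., learning the core translation task directly beats learning to correct baseline outputs.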
Anthology ID: 2025.wmt-1.49
Volume: Proceedings of the Tenth Conference on Machine Translation
Month: November
Year: 2025
Address: Suzhou, China
Editors: Barry Haddow, Tom Kocmi, Philipp Koehn, Christof Monz
Venue: WMT
Publisher: Association for Computational Linguistics
Pages: 732–739
URL: https://preview.aclanthology.org/ingest-emnlp/2025.wmt-1.49/
Cite (ACL): Hao Zong, Chao Bei, Wentao Chen, Conghu Yuan, Huan Liu, and Degen Huang. 2025. DLUT and GTCOM’s Large Language Model Based Translation System for WMT25. In Proceedings of the Tenth Conference on Machine Translation, pages 732–739, Suzhou, China. Association for Computational Linguistics.
Cite (Informal): DLUT and GTCOM’s Large Language Model Based Translation System for WMT25 (Zong et al., WMT 2025)
PDF: https://preview.aclanthology.org/ingest-emnlp/2025.wmt-1.49.pdf