Shy-hunyuan-MT at WMT25 General Machine Translation Shared Task

Mao Zheng, Zheng Li, Yang Du, Bingxin Qu, Mingyang Song


Abstract
In this paper, we present our submission to the WMT25 General Machine Translation shared task, for which we propose a synergy-enhanced policy optimization framework named Shy. This novel two-phase training framework synergistically combines knowledge distillation and fusion via reinforcement learning. In the first phase, we introduce a multi-stage training framework that harnesses the complementary strengths of multiple state-of-the-art large language models to generate diverse, high-quality translation candidates. These candidates serve as pseudo-references to guide the supervised fine-tuning of our model, Hunyuan-7B, effectively distilling the collective knowledge of multiple expert systems into a single efficient model. In the second phase, we further refine the distilled model through Group Relative Policy Optimization (GRPO), a reinforcement learning technique that employs a composite reward function. By computing rewards from multiple perspectives, our model achieves better alignment with human preferences and evaluation metrics. Extensive experiments across multiple language pairs demonstrate that our model, Shy-hunyuan-MT, yields substantial improvements in translation quality over baseline approaches. Notably, our framework achieves performance competitive with state-of-the-art systems while maintaining computational efficiency through knowledge distillation and strategic ensembling.
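The abstract's second phase combines a composite (multi-perspective) reward with group-relative policy optimization. A minimal sketch of that idea is below; the two reward signals, their weights, and the scoring heuristics are illustrative assumptions, not the paper's actual reward design, and the group-standardization step follows the general GRPO recipe rather than the authors' specific implementation.

```python
# Hedged sketch: group-relative advantages over a composite reward.
# The reward components below are toy stand-ins (assumptions), not
# the reward functions used by Shy-hunyuan-MT.

def composite_reward(candidate: str, reference: str) -> float:
    """Combine two illustrative quality signals into one scalar reward."""
    # Signal 1: token-overlap ratio with the pseudo-reference
    # (a crude stand-in for a translation-quality metric).
    cand_tokens, ref_tokens = set(candidate.split()), set(reference.split())
    overlap = len(cand_tokens & ref_tokens) / max(len(ref_tokens), 1)
    # Signal 2: length-ratio penalty (a crude stand-in for a
    # fluency/brevity signal that discourages degenerate outputs).
    length_ratio = min(len(candidate), len(reference)) / max(len(candidate), len(reference), 1)
    return 0.5 * overlap + 0.5 * length_ratio

def group_relative_advantages(rewards: list[float]) -> list[float]:
    """GRPO-style advantages: standardize rewards within a sampled group,
    so each candidate is scored relative to its siblings."""
    mean = sum(rewards) / len(rewards)
    std = (sum((r - mean) ** 2 for r in rewards) / len(rewards)) ** 0.5
    return [(r - mean) / (std + 1e-8) for r in rewards]

# Usage: score a group of sampled translations against one pseudo-reference.
reference = "the cat sat on the mat"
group = ["the cat sat on the mat", "a cat is on a mat", "dog"]
rewards = [composite_reward(c, reference) for c in group]
advantages = group_relative_advantages(rewards)
```

In GRPO, these standardized advantages then weight the policy-gradient update for each sampled candidate, removing the need for a separate value model.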
Anthology ID:
2025.wmt-1.36
Volume:
Proceedings of the Tenth Conference on Machine Translation
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Barry Haddow, Tom Kocmi, Philipp Koehn, Christof Monz
Venue:
WMT
Publisher:
Association for Computational Linguistics
Pages:
607–613
URL:
https://preview.aclanthology.org/ingest-emnlp/2025.wmt-1.36/
Cite (ACL):
Mao Zheng, Zheng Li, Yang Du, Bingxin Qu, and Mingyang Song. 2025. Shy-hunyuan-MT at WMT25 General Machine Translation Shared Task. In Proceedings of the Tenth Conference on Machine Translation, pages 607–613, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
Shy-hunyuan-MT at WMT25 General Machine Translation Shared Task (Zheng et al., WMT 2025)
PDF:
https://preview.aclanthology.org/ingest-emnlp/2025.wmt-1.36.pdf