In2X at WMT25 Translation Task

Lei Pang, Hanyi Mao, Quanjia Xiao, Chen Ruihan, Jingjun Zhang, Haixiao Liu, Xiangyi Li


Abstract
This paper presents the open-system submission by the In2X research team for the WMT25 General Machine Translation Shared Task. Our submission focuses on Japanese-related translation tasks and aims to explore a generalizable paradigm for extending large language models (LLMs) to other languages, covering aspects such as data construction and reward model design. The ultimate goal is to enable LLM-based systems to achieve exceptional performance in low-resource or less commonly spoken languages.
Anthology ID: 2025.wmt-1.43
Volume: Proceedings of the Tenth Conference on Machine Translation
Month: November
Year: 2025
Address: Suzhou, China
Editors: Barry Haddow, Tom Kocmi, Philipp Koehn, Christof Monz
Venue: WMT
Publisher: Association for Computational Linguistics
Pages: 671–679
URL: https://preview.aclanthology.org/ingest-emnlp/2025.wmt-1.43/
Cite (ACL): Lei Pang, Hanyi Mao, Quanjia Xiao, Chen Ruihan, Jingjun Zhang, Haixiao Liu, and Xiangyi Li. 2025. In2X at WMT25 Translation Task. In Proceedings of the Tenth Conference on Machine Translation, pages 671–679, Suzhou, China. Association for Computational Linguistics.
Cite (Informal): In2X at WMT25 Translation Task (Pang et al., WMT 2025)
PDF: https://preview.aclanthology.org/ingest-emnlp/2025.wmt-1.43.pdf