Jingjun Zhang


2025

In2X at WMT25 Translation Task
Lei Pang | Hanyi Mao | Quanjia Xiao | Chen Ruihan | Jingjun Zhang | Haixiao Liu | Xiangyi Li
Proceedings of the Tenth Conference on Machine Translation

This paper presents the open-system submission by the In2x research team for the WMT25 General Machine Translation Shared Task. Our submission focuses on Japanese-related translation tasks, aiming to explore a generalizable paradigm for extending large language models (LLMs) to other languages. This paradigm encompasses aspects such as data construction methods and reward model design. The ultimate goal is to enable LLM systems to achieve exceptional performance in low-resource or less widely spoken languages.