SYSTRAN @ WMT 2025 General Translation Task
Dakun Zhang | Yara Khater | Ramzi Rahli | Anna Rebollo | Josep Crego
Proceedings of the Tenth Conference on Machine Translation
We present an English-to-Japanese translation system built upon the EuroLLM-9B (Martins et al., 2025) model. The training process involves two main stages: continued pretraining (CPT) and supervised fine-tuning (SFT). After both stages, we further tuned the model on a development set to optimize performance. For training data, we employed both basic filtering techniques and high-quality filtering strategies to ensure data cleanliness. Additionally, we classify both the training data and the development data into four domains, and we train and fine-tune with domain-specific prompts during system training. Finally, we applied Minimum Bayes Risk (MBR) decoding and paragraph-level reranking as post-processing to enhance translation quality.
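To illustrate the MBR decoding step, the following is a minimal sketch: given a pool of sampled candidate translations for one segment, it selects the candidate with the highest expected utility against the other candidates used as pseudo-references. The choice of chrF as the utility function and the helper name `mbr_decode` are illustrative assumptions, not the paper's exact setup.

```python
from sacrebleu.metrics import CHRF

def mbr_decode(candidates: list[str]) -> str:
    """Return the candidate with the highest expected utility,
    treating all other candidates as pseudo-references.
    chrF is used as the utility here; this is an assumption,
    as the abstract does not name the utility metric."""
    metric = CHRF()
    best, best_score = candidates[0], float("-inf")
    for hyp in candidates:
        refs = [c for c in candidates if c is not hyp]
        # Average utility of `hyp` against every other candidate
        score = sum(metric.sentence_score(hyp, [r]).score for r in refs) / len(refs)
        if score > best_score:
            best, best_score = hyp, score
    return best

# Toy pool of sampled English-to-Japanese candidates for one segment
candidates = [
    "こんにちは、世界。",
    "やあ、世界。",
    "こんにちは世界。",
]
print(mbr_decode(candidates))
```

Note that MBR of this form is quadratic in the size of the candidate pool, which is why pools are typically kept to a few dozen samples per segment.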