Miroslav Hrabal
2025
CUNI at WMT25 General Translation Task
Josef Jon | Miroslav Hrabal | Martin Popel | Ondřej Bojar
Proceedings of the Tenth Conference on Machine Translation
This paper describes the CUNI submissions to the WMT25 General Translation task, namely for the English to Czech, English to Serbian, Czech to German and Czech to Ukrainian language pairs. We worked in multiple teams, each with a different approach, ranging from traditional, smaller Transformer NMT models trained at both the sentence and the document level, to fine-tuning LLMs using LoRA and CPO. We show that these methods are effective in improving automatic MT evaluation scores compared to the base pretrained models.
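The abstract mentions CPO (Contrastive Preference Optimization) fine-tuning but the paper itself is not reproduced on this page, so the following is only a minimal sketch of the CPO objective on toy sequence log-probabilities: a DPO-style preference term without a reference model, plus an NLL term on the preferred translation. The names `beta` and `nll_weight` are assumed hyperparameter names, not taken from the paper.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def cpo_loss(logp_chosen, logp_rejected, beta=0.1, nll_weight=1.0):
    """Toy CPO loss for one (chosen, rejected) translation pair.

    logp_chosen / logp_rejected: total log-probability the model assigns
    to the preferred and dispreferred translation, respectively.
    """
    # Preference term: like DPO, but with no frozen reference model.
    pref = -math.log(sigmoid(beta * (logp_chosen - logp_rejected)))
    # Behaviour-cloning term: NLL of the preferred translation.
    nll = -logp_chosen
    return pref + nll_weight * nll

# When both translations are equally likely, the preference term is log 2.
print(cpo_loss(-1.0, -1.0))  # log(2) + 1.0 ≈ 1.6931
```

In practice this loss would be computed per batch from token-level log-probabilities of an LLM; the scalar version above only illustrates the shape of the objective.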
CUNI and Phrase at WMT25 MT Evaluation Task
Miroslav Hrabal | Ondrej Glembek | Aleš Tamchyna | Almut Silja Hildebrand | Alan Eckhard | Miroslav Štola | Sergio Penkale | Zuzana Šimečková | Ondřej Bojar | Alon Lavie | Craig Stewart
Proceedings of the Tenth Conference on Machine Translation
This paper describes the joint effort of Phrase a.s. and Charles University’s Institute of Formal and Applied Linguistics (CUNI/UFAL) on the WMT25 Automated Translation Quality Evaluation Systems Shared Task. Both teams participated in both a collaborative and a competitive manner, i.e. they each submitted a system of their own as well as a contrastive joint system ensemble. In Task 1, we show that such an ensembling, if chosen in a clever way, can lead to a performance boost. We present an analysis of various kinds of systems, comprising both a “traditional” NN-based approach and different flavours of LLMs: off-the-shelf commercial models, their fine-tuned versions, and also in-house, custom-trained alternative models. In Tasks 2 and 3 we show Phrase’s approach to tackling the tasks via various GPT models: Error Span Annotation via the complete MQM solution using non-reasoning models (including fine-tuned versions) in Task 2, and using reasoning models in Task 3.
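The abstract does not say how the joint system ensemble combines the two teams' outputs. A common baseline for combining quality-estimation scores, and purely an assumption here, is to z-normalize each system's segment scores and average them per segment:

```python
from statistics import mean, pstdev

def zscore(scores):
    """Standardize one system's scores to zero mean, unit variance."""
    mu, sigma = mean(scores), pstdev(scores)
    return [(s - mu) / sigma if sigma else 0.0 for s in scores]

def ensemble(system_scores):
    """Average z-normalized scores across systems, per segment."""
    normalized = [zscore(s) for s in system_scores]
    return [mean(col) for col in zip(*normalized)]

# Two hypothetical QE systems scoring the same three segments
# on incompatible scales; normalization makes them comparable.
sys_a = [0.2, 0.5, 0.8]
sys_b = [60.0, 80.0, 70.0]
print(ensemble([sys_a, sys_b]))
```

Weighted averaging or learned combination would follow the same pattern; the key point is normalizing away each system's scale before combining.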
2024
CUNI at WMT24 General Translation Task: LLMs, (Q)LoRA, CPO and Model Merging
Miroslav Hrabal | Josef Jon | Martin Popel | Nam Luu | Danil Semin | Ondřej Bojar
Proceedings of the Ninth Conference on Machine Translation
This paper presents the contributions of Charles University teams to the WMT24 General Translation task (English to Czech, German and Russian, and Czech to Ukrainian), and the WMT24 Translation into Low-Resource Languages of Spain task. Our most elaborate submission, CUNI-MH for en2cs, is the result of fine-tuning Mistral 7B v0.1 for translation using a three-stage process: supervised fine-tuning using QLoRA, Contrastive Preference Optimization, and merging of model checkpoints. We also describe the CUNI-GA, CUNI-Transformer and CUNI-DocTransformer submissions, which are based on our systems from the previous year. Our en2ru system CUNI-DS uses a similar first stage as CUNI-MH (QLoRA for en2cs) and follows with transfer to en2ru. For en2de (CUNI-NL), we experimented with an LLM-based speech translation system, used here to translate without the speech input. For the Translation into Low-Resource Languages of Spain task, we performed QLoRA fine-tuning of a large LLM on a small amount of synthetic (backtranslated) data.
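The three-stage CUNI-MH recipe ends with merging of model checkpoints. The exact merging scheme is not given in this abstract, so the sketch below shows only plain uniform weight averaging ("model soup" style) over toy state dicts, with lists of floats standing in for real parameter tensors:

```python
def merge_checkpoints(checkpoints):
    """Uniformly average parameter values across a list of state dicts.

    Each checkpoint is a dict mapping parameter names to lists of floats
    (a stand-in for tensors); all checkpoints must share the same keys
    and shapes.
    """
    if not checkpoints:
        raise ValueError("need at least one checkpoint")
    merged = {}
    for name in checkpoints[0]:
        vals = [ckpt[name] for ckpt in checkpoints]
        # Element-wise mean across checkpoints.
        merged[name] = [sum(col) / len(col) for col in zip(*vals)]
    return merged

ckpt_a = {"layer.weight": [1.0, 2.0], "layer.bias": [0.0]}
ckpt_b = {"layer.weight": [3.0, 4.0], "layer.bias": [2.0]}
print(merge_checkpoints([ckpt_a, ckpt_b]))
# {'layer.weight': [2.0, 3.0], 'layer.bias': [1.0]}
```

With real models, the same loop would run over `torch` state dicts using `torch.stack(...).mean(0)` per parameter; non-uniform (weighted) merging is a straightforward variation.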