TartuNLP at WMT25 LLMs with Limited Resources for Slavic Languages Shared Task

Taido Purason, Mark Fishel


Abstract
This paper describes the TartuNLP submission to the Upper Sorbian (hsb) and Lower Sorbian (dsb) tracks of the WMT25 LLMs with Limited Resources for Slavic Languages shared task, which jointly targets machine translation (MT) and question answering (QA). We develop a single multilingual model based on Qwen2.5-3B-Instruct by continuing pretraining on Sorbian monolingual and parallel data together with general instruction datasets, combining language acquisition and instruction-following in a single step. The resulting model delivers substantial improvements over the baseline Qwen2.5-3B-Instruct model and also achieves the highest ranking for both tasks in the hsb and dsb shared task tracks.
Anthology ID:
2025.wmt-1.88
Volume:
Proceedings of the Tenth Conference on Machine Translation
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Barry Haddow, Tom Kocmi, Philipp Koehn, Christof Monz
Venue:
WMT
Publisher:
Association for Computational Linguistics
Pages:
1143–1150
URL:
https://preview.aclanthology.org/ingest-emnlp/2025.wmt-1.88/
Cite (ACL):
Taido Purason and Mark Fishel. 2025. TartuNLP at WMT25 LLMs with Limited Resources for Slavic Languages Shared Task. In Proceedings of the Tenth Conference on Machine Translation, pages 1143–1150, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
TartuNLP at WMT25 LLMs with Limited Resources for Slavic Languages Shared Task (Purason & Fishel, WMT 2025)
PDF:
https://preview.aclanthology.org/ingest-emnlp/2025.wmt-1.88.pdf