Maali Tars


2021

Extremely low-resource machine translation for closely related languages
Maali Tars | Andre Tättar | Mark Fišel
Proceedings of the 23rd Nordic Conference on Computational Linguistics (NoDaLiDa)

An effective way to improve extremely low-resource neural machine translation is multilingual training, which can be further strengthened by leveraging monolingual data to create synthetic bilingual corpora through back-translation. This work focuses on closely related languages of the Uralic language family spoken in the Estonian and Finnish geographical regions. We find that multilingual learning and synthetic corpora increase translation quality in every language pair for which we have data. We show that transfer learning and fine-tuning are highly effective for low-resource machine translation and achieve the best results. We collected new parallel data for Võro, North Saami, and South Saami, and present the first neural machine translation results for these languages.
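
A minimal sketch of the back-translation idea mentioned in the abstract: monolingual data in one language is translated by a reverse-direction model to produce synthetic parallel pairs. The language direction (Finnish to Estonian) and the function `translate_fi_to_et` are illustrative assumptions, not the authors' code; the translator here is a stand-in placeholder so the example runs on its own.

```python
# Back-translation sketch (illustrative only, not the paper's implementation).

def translate_fi_to_et(sentence: str) -> str:
    """Placeholder for a trained Finnish->Estonian model (hypothetical)."""
    return "<synthetic Estonian translation of: " + sentence + ">"

def back_translate(monolingual_fi: list[str]) -> list[tuple[str, str]]:
    """Pair each monolingual Finnish sentence with its machine-generated
    Estonian back-translation, yielding synthetic (source, target) data
    for training an Estonian->Finnish system."""
    synthetic_pairs = []
    for fi_sentence in monolingual_fi:
        et_synthetic = translate_fi_to_et(fi_sentence)
        # Synthetic source sentence paired with the genuine target sentence.
        synthetic_pairs.append((et_synthetic, fi_sentence))
    return synthetic_pairs

if __name__ == "__main__":
    mono = ["Hyvää huomenta.", "Sataa lunta."]
    for src, tgt in back_translate(mono):
        print(src, "\t", tgt)
```

The key point of the technique is that the genuine monolingual sentences end up on the target side of the synthetic corpus, so the model learns to produce fluent target-language output even though the source side is machine-generated.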