Improving Similar Language Translation With Transfer Learning

Ife Adebara, Muhammad Abdul-Mageed


Abstract
We investigate transfer learning based on pre-trained neural machine translation models to translate between (low-resource) similar languages. This work is part of our contribution to the WMT 2021 Similar Languages Translation Shared Task, where we submitted models for different language pairs, including French-Bambara, Spanish-Catalan, and Spanish-Portuguese in both directions. Our models for Catalan-Spanish (82.79 BLEU) and Portuguese-Spanish (87.11 BLEU) rank first in the official shared task evaluation, and we are the only team to submit models for the French-Bambara pair.
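As a rough illustration of the transfer-learning setup the abstract describes, the sketch below fine-tunes a publicly available pre-trained Marian NMT checkpoint on a toy Catalan-Spanish parallel example using the Hugging Face Transformers library. The checkpoint name, the toy data, and the hyperparameters are illustrative assumptions, not the configuration reported in the paper.

```python
# Minimal sketch: continue training a pre-trained NMT model on a
# (low-resource) similar-language pair instead of training from scratch.
# The checkpoint below is an assumed, publicly available Marian model;
# the paper's actual models and data may differ.
import torch
from transformers import MarianMTModel, MarianTokenizer

model_name = "Helsinki-NLP/opus-mt-ca-es"  # assumed Catalan->Spanish checkpoint
tokenizer = MarianTokenizer.from_pretrained(model_name)
model = MarianMTModel.from_pretrained(model_name)

# Toy parallel sentences standing in for the shared-task training corpus.
src_texts = ["El gat dorm al sofà."]
tgt_texts = ["El gato duerme en el sofá."]

inputs = tokenizer(
    src_texts,
    text_target=tgt_texts,
    return_tensors="pt",
    padding=True,
    truncation=True,
)

# One fine-tuning step on the new pair (transfer learning).
model.train()
optimizer = torch.optim.AdamW(model.parameters(), lr=5e-5)
loss = model(**inputs).loss
loss.backward()
optimizer.step()
optimizer.zero_grad()

# Translate with the adapted model.
model.eval()
batch = tokenizer(["El gos corre pel parc."], return_tensors="pt", padding=True)
out = model.generate(**batch)
print(tokenizer.batch_decode(out, skip_special_tokens=True))
```

In practice one would iterate over the full parallel corpus for several epochs and evaluate with BLEU on a held-out set, but the single step above captures the core idea: the pre-trained weights serve as the initialization for the low-resource pair.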
Anthology ID: 2021.wmt-1.27
Volume: Proceedings of the Sixth Conference on Machine Translation
Month: November
Year: 2021
Address: Online
Venues: EMNLP | WMT
SIG: SIGMT
Publisher: Association for Computational Linguistics
Pages: 273–278
URL: https://aclanthology.org/2021.wmt-1.27
Cite (ACL):
Ife Adebara and Muhammad Abdul-Mageed. 2021. Improving Similar Language Translation With Transfer Learning. In Proceedings of the Sixth Conference on Machine Translation, pages 273–278, Online. Association for Computational Linguistics.
Cite (Informal):
Improving Similar Language Translation With Transfer Learning (Adebara & Abdul-Mageed, WMT 2021)
PDF: https://preview.aclanthology.org/update-css-js/2021.wmt-1.27.pdf