Improving Similar Language Translation With Transfer Learning

Ife Adebara, Muhammad Abdul-Mageed


Abstract
We investigate transfer learning based on pre-trained neural machine translation models to translate between (low-resource) similar languages. This work is part of our contribution to the WMT 2021 Similar Languages Translation Shared Task, where we submitted models for different language pairs, including French-Bambara, Spanish-Catalan, and Spanish-Portuguese, in both directions. Our models for Catalan-Spanish (82.79 BLEU) and Portuguese-Spanish (87.11 BLEU) rank first in the official shared task evaluation, and we are the only team to submit models for the French-Bambara pair.
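As a concrete illustration of the transfer-learning recipe the abstract describes, the sketch below fine-tunes a pre-trained NMT checkpoint on parallel data for a similar-language pair using Hugging Face Transformers. The checkpoint name, the toy Catalan-Spanish data, and all hyperparameters are illustrative assumptions, not the authors' actual configuration.

from datasets import Dataset
from transformers import (
    AutoModelForSeq2SeqLM,
    AutoTokenizer,
    DataCollatorForSeq2Seq,
    Seq2SeqTrainer,
    Seq2SeqTrainingArguments,
)

# Hypothetical starting point: any pre-trained seq2seq MT checkpoint could be used.
checkpoint = "Helsinki-NLP/opus-mt-ca-es"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSeq2SeqLM.from_pretrained(checkpoint)

# Toy parallel corpus standing in for the shared-task training data.
train = Dataset.from_dict({
    "ca": ["Bon dia.", "Com estàs?"],
    "es": ["Buenos días.", "¿Cómo estás?"],
})

def preprocess(batch):
    # Tokenize source sentences; tokenize targets via text_target so the
    # tokenizer applies any target-side settings the model expects.
    model_inputs = tokenizer(batch["ca"], truncation=True, max_length=128)
    labels = tokenizer(text_target=batch["es"], truncation=True, max_length=128)
    model_inputs["labels"] = labels["input_ids"]
    return model_inputs

train = train.map(preprocess, batched=True, remove_columns=["ca", "es"])

args = Seq2SeqTrainingArguments(
    output_dir="ft-ca-es",          # illustrative values throughout
    per_device_train_batch_size=16,
    learning_rate=3e-5,
    num_train_epochs=3,
    predict_with_generate=True,
)

trainer = Seq2SeqTrainer(
    model=model,
    args=args,
    train_dataset=train,
    data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
)
trainer.train()  # fine-tunes the pre-trained model on the new language pair

Starting from a pre-trained checkpoint is particularly effective for similar languages, since the source and target sides share much of their subword vocabulary and the model's learned representations transfer with little adaptation.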
Anthology ID:
2021.wmt-1.27
Volume:
Proceedings of the Sixth Conference on Machine Translation
Month:
November
Year:
2021
Address:
Online
Editors:
Loïc Barrault, Ondřej Bojar, Fethi Bougares, Rajen Chatterjee, Marta R. Costa-jussà, Christian Federmann, Mark Fishel, Alexander Fraser, Markus Freitag, Yvette Graham, Roman Grundkiewicz, Paco Guzmán, Barry Haddow, Matthias Huck, Antonio Jimeno Yepes, Philipp Koehn, Tom Kocmi, André Martins, Makoto Morishita, Christof Monz
Venue:
WMT
SIG:
SIGMT
Publisher:
Association for Computational Linguistics
Pages:
273–278
URL:
https://aclanthology.org/2021.wmt-1.27
Cite (ACL):
Ife Adebara and Muhammad Abdul-Mageed. 2021. Improving Similar Language Translation With Transfer Learning. In Proceedings of the Sixth Conference on Machine Translation, pages 273–278, Online. Association for Computational Linguistics.
Cite (Informal):
Improving Similar Language Translation With Transfer Learning (Adebara & Abdul-Mageed, WMT 2021)
PDF:
https://preview.aclanthology.org/emnlp-22-attachments/2021.wmt-1.27.pdf