Abstract
We investigate transfer learning based on pre-trained neural machine translation models to translate between (low-resource) similar languages. This work is part of our contribution to the WMT 2021 Similar Languages Translation Shared Task, where we submitted models for different language pairs, including French-Bambara, Spanish-Catalan, and Spanish-Portuguese, in both directions. Our models for Catalan-Spanish (82.79 BLEU) and Portuguese-Spanish (87.11 BLEU) rank first in the official shared task evaluation, and we are the only team to submit models for the French-Bambara pair.
- Anthology ID:
- 2021.wmt-1.27
- Volume:
- Proceedings of the Sixth Conference on Machine Translation
- Month:
- November
- Year:
- 2021
- Address:
- Online
- Venue:
- WMT
- SIG:
- SIGMT
- Publisher:
- Association for Computational Linguistics
- Pages:
- 273–278
- URL:
- https://aclanthology.org/2021.wmt-1.27
- Cite (ACL):
- Ife Adebara and Muhammad Abdul-Mageed. 2021. Improving Similar Language Translation With Transfer Learning. In Proceedings of the Sixth Conference on Machine Translation, pages 273–278, Online. Association for Computational Linguistics.
- Cite (Informal):
- Improving Similar Language Translation With Transfer Learning (Adebara & Abdul-Mageed, WMT 2021)
- PDF:
- https://preview.aclanthology.org/paclic-22-ingestion/2021.wmt-1.27.pdf