Ensembling of Distilled Models from Multi-task Teachers for Constrained Resource Language Pairs
Amr Hendy, Esraa A. Gad, Mohamed Abdelghaffar, Jailan S. ElMosalami, Mohamed Afify, Ahmed Y. Tawfik, Hany Hassan Awadalla
Abstract
This paper describes the Microsoft Egypt Development Center (EgDC) submission to the constrained track of the WMT21 shared news translation task. We focus on three relatively low-resource language pairs: Bengali ↔ Hindi, English ↔ Hausa, and Xhosa ↔ Zulu. To overcome the limited parallel data, we train a multilingual model using a multitask objective that employs both parallel and monolingual data. In addition, we augment the data using back-translation. We also train a bilingual model incorporating back-translation and knowledge distillation, then combine the two models using sequence-to-sequence mapping. We see around 70% relative gain in BLEU for En ↔ Ha and around 25% relative improvement for Bn ↔ Hi and Xh ↔ Zu compared to bilingual baselines.
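The abstract's final step combines two models at inference time. As a rough illustration of one common way to do this, the sketch below averages the per-token next-word distributions of several decoders during greedy decoding. This is a minimal, hypothetical example, not the authors' implementation: the toy vocabulary, the stubbed `step_probs` function, and the greedy loop are assumptions for illustration; a real system would ensemble trained Transformer decoders inside beam search.

```python
# Hypothetical sketch of decode-time model ensembling (not the paper's code):
# each model contributes a next-token distribution, and the ensemble decodes
# greedily from the averaged distribution. Model internals are stubbed with
# random logits so the example is self-contained and runnable.
import numpy as np

VOCAB, EOS = 8, 7  # toy vocabulary size and end-of-sequence token id


def step_probs(rng):
    """Stand-in for one decoder step of a trained model: returns a
    next-token probability distribution over the toy vocabulary."""
    logits = rng.normal(size=VOCAB)
    e = np.exp(logits - logits.max())  # numerically stable softmax
    return e / e.sum()


def ensemble_greedy_decode(models, max_len=20):
    """Greedy decoding from the token distribution averaged over models."""
    out = []
    for _ in range(max_len):
        avg = np.mean([step_probs(m) for m in models], axis=0)
        tok = int(avg.argmax())
        out.append(tok)
        if tok == EOS:
            break
    return out


# Two "models", e.g. a distilled-from-multilingual-teacher model and a
# bilingual model, represented here by independent random generators.
print(ensemble_greedy_decode([np.random.default_rng(0), np.random.default_rng(1)]))
```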
- Anthology ID: 2021.wmt-1.8
- Volume: Proceedings of the Sixth Conference on Machine Translation
- Month: November
- Year: 2021
- Address: Online
- Venue: WMT
- SIG: SIGMT
- Publisher: Association for Computational Linguistics
- Pages: 130–135
- URL: https://aclanthology.org/2021.wmt-1.8
- Cite (ACL): Amr Hendy, Esraa A. Gad, Mohamed Abdelghaffar, Jailan S. ElMosalami, Mohamed Afify, Ahmed Y. Tawfik, and Hany Hassan Awadalla. 2021. Ensembling of Distilled Models from Multi-task Teachers for Constrained Resource Language Pairs. In Proceedings of the Sixth Conference on Machine Translation, pages 130–135, Online. Association for Computational Linguistics.
- Cite (Informal): Ensembling of Distilled Models from Multi-task Teachers for Constrained Resource Language Pairs (Hendy et al., WMT 2021)
- PDF: https://aclanthology.org/2021.wmt-1.8.pdf