Multilingual Neural Machine Translation: Can Linguistic Hierarchies Help?

Fahimeh Saleh, Wray Buntine, Gholamreza Haffari, Lan Du


Abstract
Multilingual Neural Machine Translation (MNMT) trains a single NMT model that supports translation between multiple languages, rather than training separate models for different languages. Learning a single model can enhance low-resource translation by leveraging data from multiple languages. However, the performance of an MNMT model is highly dependent on the types of languages used in training, as transferring knowledge from a diverse set of languages degrades the translation performance due to negative transfer. In this paper, we propose a Hierarchical Knowledge Distillation (HKD) approach for MNMT which capitalises on language groups generated according to typological features and phylogeny of languages to overcome the issue of negative transfer. HKD generates a set of multilingual teacher-assistant models via a selective knowledge distillation mechanism based on the language groups, and then distills the ultimate multilingual model from those assistants in an adaptive way. Experimental results derived from the TED dataset with 53 languages demonstrate the effectiveness of our approach in avoiding the negative transfer effect in MNMT, leading to improved translation performance (about 1 BLEU point on average) compared to strong baselines.
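To make the distillation mechanism in the abstract concrete, here is a minimal sketch of token-level knowledge distillation and a weighted combination over several assistant models. All function names, the temperature value, and the adaptive weights are illustrative assumptions, not the paper's actual formulation.

```python
import numpy as np

def softmax(logits, temperature=1.0):
    """Temperature-softened softmax over the last axis."""
    z = logits / temperature
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kd_loss(student_logits, teacher_logits, temperature=2.0):
    """Token-level distillation loss: cross-entropy of the student
    against the teacher's softened output distribution."""
    p_teacher = softmax(teacher_logits, temperature)
    log_p_student = np.log(softmax(student_logits, temperature))
    return float(-(p_teacher * log_p_student).sum(axis=-1).mean())

def hierarchical_kd_loss(student_logits, assistant_logits_list, weights):
    """Illustrative second stage: the final multilingual student is
    distilled from several group-specific assistants, with a weighted
    (here placeholder) combination standing in for adaptive weighting."""
    losses = [kd_loss(student_logits, a) for a in assistant_logits_list]
    return float(np.dot(weights, losses))
```

In this sketch, each group-specific assistant would first be distilled from teachers within one linguistically defined language group, and the final model would then combine the assistants' signals; how the weights adapt during training is left abstract here.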
Anthology ID:
2021.findings-emnlp.114
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2021
Month:
November
Year:
2021
Address:
Punta Cana, Dominican Republic
Editors:
Marie-Francine Moens, Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih
Venue:
Findings
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Pages:
1313–1330
URL:
https://aclanthology.org/2021.findings-emnlp.114
DOI:
10.18653/v1/2021.findings-emnlp.114
Cite (ACL):
Fahimeh Saleh, Wray Buntine, Gholamreza Haffari, and Lan Du. 2021. Multilingual Neural Machine Translation: Can Linguistic Hierarchies Help?. In Findings of the Association for Computational Linguistics: EMNLP 2021, pages 1313–1330, Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
Multilingual Neural Machine Translation: Can Linguistic Hierarchies Help? (Saleh et al., Findings 2021)
PDF:
https://preview.aclanthology.org/nschneid-patch-5/2021.findings-emnlp.114.pdf
Video:
https://preview.aclanthology.org/nschneid-patch-5/2021.findings-emnlp.114.mp4