Viewing Knowledge Transfer in Multilingual Machine Translation Through a Representational Lens

David Stap, Vlad Niculae, Christof Monz


Abstract
We argue that translation quality alone is not a sufficient metric for measuring knowledge transfer in multilingual neural machine translation. To support this claim, we introduce Representational Transfer Potential (RTP), which measures representational similarities between languages. We show that RTP can measure both positive and negative transfer (interference), and find that RTP is strongly correlated with changes in translation quality, indicating that transfer does occur. Furthermore, we investigate data and language characteristics that are relevant for transfer, and find that multi-parallel overlap is an important yet under-explored feature. Based on this, we develop a novel training scheme, which uses an auxiliary similarity loss that encourages representations to be more invariant across languages by taking advantage of multi-parallel data. We show that our method yields increased translation quality for low- and mid-resource languages across multiple data and model setups.
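The training scheme is only summarized here, but the core idea of an auxiliary similarity loss over multi-parallel data can be sketched in a few lines: alongside the usual translation loss, penalize the distance between encoder representations of meaning-equivalent source sentences in two languages, pushing those representations toward cross-lingual invariance. The function names, the cosine-distance choice, and the `weight` hyperparameter below are illustrative assumptions, not the paper's exact formulation.

```python
import math

def cosine_distance(u, v):
    # 1 - cosine similarity between two representation vectors
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return 1.0 - dot / (norm_u * norm_v)

def combined_loss(translation_loss, repr_lang1, repr_lang2, weight=0.1):
    # Hypothetical auxiliary term: penalize dissimilar encoder
    # representations of the same (multi-parallel) sentence in two
    # source languages, encouraging language-invariant encodings.
    aux = cosine_distance(repr_lang1, repr_lang2)
    return translation_loss + weight * aux
```

With identical representations the auxiliary term vanishes and only the translation loss remains; the more the two encodings diverge, the larger the added penalty.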
Anthology ID: 2023.findings-emnlp.998
Volume: Findings of the Association for Computational Linguistics: EMNLP 2023
Month: December
Year: 2023
Address: Singapore
Editors: Houda Bouamor, Juan Pino, Kalika Bali
Venue: Findings
Publisher: Association for Computational Linguistics
Pages: 14973–14987
URL: https://aclanthology.org/2023.findings-emnlp.998
DOI: 10.18653/v1/2023.findings-emnlp.998
Cite (ACL):
David Stap, Vlad Niculae, and Christof Monz. 2023. Viewing Knowledge Transfer in Multilingual Machine Translation Through a Representational Lens. In Findings of the Association for Computational Linguistics: EMNLP 2023, pages 14973–14987, Singapore. Association for Computational Linguistics.
Cite (Informal):
Viewing Knowledge Transfer in Multilingual Machine Translation Through a Representational Lens (Stap et al., Findings 2023)
PDF: https://preview.aclanthology.org/naacl24-info/2023.findings-emnlp.998.pdf