Learning Nearest Neighbour Informed Latent Word Embeddings to Improve Zero-Shot Machine Translation

Nishant Kambhatla, Logan Born, Anoop Sarkar


Abstract
Multilingual neural translation models exploit cross-lingual transfer to perform zero-shot translation between unseen language pairs. Past efforts to improve cross-lingual transfer have focused on aligning contextual sentence-level representations. This paper introduces three novel contributions that exploit nearest neighbours at the token level during training: (i) an efficient, gradient-friendly way to share representations between neighboring tokens; (ii) an attentional semantic layer which extracts latent features from shared embeddings; and (iii) an agreement loss to harmonize predictions across different sentence representations. Experiments on two multilingual datasets demonstrate consistent gains in zero-shot translation over strong baselines.
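The abstract's first and third contributions can be illustrated with a minimal sketch. The function and parameter names below are hypothetical, not the authors' code: `knn_mix_embeddings` interpolates each token embedding with the mean of its k nearest neighbours in the embedding table (one simple way to share representations between neighbouring tokens), and `agreement_loss` is a symmetric KL term that penalizes disagreement between two prediction distributions, as an agreement loss might.

```python
import numpy as np

def knn_mix_embeddings(emb, k=2, alpha=0.5):
    """Share representations between neighbouring tokens: interpolate
    each embedding row with the mean of its k nearest neighbours
    (by cosine similarity) in the embedding table.
    Hypothetical sketch, not the paper's implementation."""
    norm = emb / np.linalg.norm(emb, axis=1, keepdims=True)
    sim = norm @ norm.T
    np.fill_diagonal(sim, -np.inf)            # exclude self-neighbours
    nn_idx = np.argsort(-sim, axis=1)[:, :k]  # indices of k nearest rows
    nn_mean = emb[nn_idx].mean(axis=1)        # average neighbour embedding
    return alpha * emb + (1 - alpha) * nn_mean

def agreement_loss(p, q, eps=1e-9):
    """Symmetric KL divergence between two prediction distributions
    over the same targets; zero when the distributions agree."""
    p, q = p + eps, q + eps
    kl_pq = np.sum(p * np.log(p / q), axis=-1)
    kl_qp = np.sum(q * np.log(q / p), axis=-1)
    return 0.5 * (kl_pq + kl_qp).mean()
```

With `alpha=1.0` the mixing reduces to the original embeddings, and the agreement loss vanishes when both distributions are identical; the actual method in the paper operates on latent features from an attentional semantic layer rather than raw table lookups.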
Anthology ID:
2023.iwslt-1.27
Volume:
Proceedings of the 20th International Conference on Spoken Language Translation (IWSLT 2023)
Month:
July
Year:
2023
Address:
Toronto, Canada (in-person and online)
Editors:
Elizabeth Salesky, Marcello Federico, Marine Carpuat
Venue:
IWSLT
SIG:
SIGSLT
Publisher:
Association for Computational Linguistics
Pages:
291–301
URL:
https://aclanthology.org/2023.iwslt-1.27
DOI:
10.18653/v1/2023.iwslt-1.27
Cite (ACL):
Nishant Kambhatla, Logan Born, and Anoop Sarkar. 2023. Learning Nearest Neighbour Informed Latent Word Embeddings to Improve Zero-Shot Machine Translation. In Proceedings of the 20th International Conference on Spoken Language Translation (IWSLT 2023), pages 291–301, Toronto, Canada (in-person and online). Association for Computational Linguistics.
Cite (Informal):
Learning Nearest Neighbour Informed Latent Word Embeddings to Improve Zero-Shot Machine Translation (Kambhatla et al., IWSLT 2023)
PDF:
https://preview.aclanthology.org/naacl24-info/2023.iwslt-1.27.pdf