Improving Word Translation via Two-Stage Contrastive Learning

Yaoyiran Li, Fangyu Liu, Nigel Collier, Anna Korhonen, Ivan Vulić


Abstract
Word translation or bilingual lexicon induction (BLI) is a key cross-lingual task, aiming to bridge the lexical gap between different languages. In this work, we propose a robust and effective two-stage contrastive learning framework for the BLI task. At Stage C1, we refine standard cross-lingual linear maps between static word embeddings (WEs) via a contrastive learning objective; we also show how to integrate it into the self-learning procedure for even more refined cross-lingual maps. At Stage C2, we conduct BLI-oriented contrastive fine-tuning of mBERT, unlocking its word translation capability. We also show that static WEs induced from the ‘C2-tuned’ mBERT complement the static WEs from Stage C1. Comprehensive experiments on standard BLI datasets for diverse languages and different experimental setups demonstrate substantial gains achieved by our framework. While the BLI method from Stage C1 already yields substantial gains over all state-of-the-art BLI methods in our comparison, even stronger improvements are achieved with the full two-stage framework: e.g., we report gains for 112/112 BLI setups, spanning 28 language pairs.
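The Stage C1 idea sketched in the abstract — learning a linear map so that a source word's mapped embedding sits close to its translation and far from other words — corresponds to an InfoNCE-style contrastive objective with in-batch negatives. The following is a minimal PyTorch sketch of that objective under illustrative assumptions: the function name, the temperature tau, and the toy training step are ours, not the paper's exact recipe (the authors' implementation is in cambridgeltl/contrastivebli).

```python
import torch
import torch.nn.functional as F

def contrastive_map_loss(X_src, Y_tgt, W, tau=0.1):
    """InfoNCE-style loss over a batch of translation pairs.

    X_src: (B, d) source-language static word embeddings
    Y_tgt: (B, d) target-language embeddings of their translations
    W:     (d, d) linear map from the source to the target space
    Row i of X_src @ W should be closest to row i of Y_tgt; all
    other rows in the batch serve as in-batch negatives.
    """
    Z = F.normalize(X_src @ W, dim=-1)   # map source WEs, length-normalise
    Y = F.normalize(Y_tgt, dim=-1)
    logits = Z @ Y.t() / tau             # (B, B) scaled cosine similarities
    labels = torch.arange(Z.size(0))     # positives lie on the diagonal
    return F.cross_entropy(logits, labels)

# Toy usage: refine a map between 300-d static WEs on a batch of 128 pairs.
d, B = 300, 128
W = torch.nn.Parameter(torch.eye(d))    # e.g. start from the identity map
opt = torch.optim.Adam([W], lr=1e-3)
X, Y = torch.randn(B, d), torch.randn(B, d)  # stand-ins for real WE pairs
opt.zero_grad()
loss = contrastive_map_loss(X, Y, W)
loss.backward()
opt.step()
```

In a real pipeline, W would typically be initialised from a standard linear (e.g. Procrustes) solution over a seed translation dictionary and then refined with this loss inside the self-learning loop the abstract mentions.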
Anthology ID:
2022.acl-long.299
Volume:
Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
May
Year:
2022
Address:
Dublin, Ireland
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
4353–4374
URL:
https://aclanthology.org/2022.acl-long.299
DOI:
10.18653/v1/2022.acl-long.299
Bibkey:
li-etal-2022-improving
Cite (ACL):
Yaoyiran Li, Fangyu Liu, Nigel Collier, Anna Korhonen, and Ivan Vulić. 2022. Improving Word Translation via Two-Stage Contrastive Learning. In Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 4353–4374, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
Improving Word Translation via Two-Stage Contrastive Learning (Li et al., ACL 2022)
PDF:
https://aclanthology.org/2022.acl-long.299.pdf
Software:
 2022.acl-long.299.software.zip
Video:
 https://aclanthology.org/2022.acl-long.299.mp4
Code:
 cambridgeltl/contrastivebli
Data:
 PanLex-BLI
 XLING