Combining Static and Contextualised Multilingual Embeddings

Katharina Hämmerl, Jindřich Libovický, Alexander Fraser


Abstract
Static and contextual multilingual embeddings have complementary strengths. Static embeddings, while less expressive than contextual language models, can be more straightforwardly aligned across multiple languages. We combine the strengths of static and contextual models to improve multilingual representations. We extract static embeddings for 40 languages from XLM-R, validate those embeddings with cross-lingual word retrieval, and then align them using VecMap. This results in high-quality, highly multilingual static embeddings. Then we apply a novel continued pre-training approach to XLM-R, leveraging the high quality alignment of our static embeddings to better align the representation space of XLM-R. We show positive results for multiple complex semantic tasks. We release the static embeddings and the continued pre-training code. Unlike most previous work, our continued pre-training approach does not require parallel text.
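The pipeline in the abstract (pool contextual vectors into static word embeddings, then align the resulting spaces) can be sketched with a toy NumPy example. This is not the paper's actual code: `average_contextual` and `procrustes_align` are hypothetical helper names, the vectors are random stand-ins for XLM-R outputs, and orthogonal Procrustes is used here only as the core mapping step that dictionary-based aligners such as VecMap build on (VecMap additionally normalizes embeddings and iteratively refines the seed dictionary).

```python
import numpy as np

def average_contextual(token_vectors):
    """Toy static-embedding extraction: a word's static vector is the
    mean of its contextual vectors across many occurrences."""
    return np.mean(token_vectors, axis=0)

def procrustes_align(src, tgt):
    """Orthogonal Procrustes: find the rotation W that maps src onto tgt
    for a seed dictionary of translation pairs (rows of src/tgt)."""
    u, _, vt = np.linalg.svd(tgt.T @ src)
    return u @ vt  # aligned_src = src @ W.T

rng = np.random.default_rng(0)
d = 8  # embedding dimension (toy; XLM-R base uses 768)

# Pretend contextual vectors for one word seen in 5 sentences.
occurrences = rng.normal(size=(5, d))
static_vec = average_contextual(occurrences)  # shape (d,)

# Pretend 20 seed-dictionary pairs; tgt is a rotated copy of src,
# so the aligner should recover the rotation exactly.
src = rng.normal(size=(20, d))
true_rot, _ = np.linalg.qr(rng.normal(size=(d, d)))
tgt = src @ true_rot.T
W = procrustes_align(src, tgt)
aligned = src @ W.T
print(np.allclose(aligned, tgt, atol=1e-6))  # → True
```

In the toy setting above the mapping is recovered exactly because the target space really is a rotation of the source space; real cross-lingual spaces are only approximately isometric, which is why the quality of the extracted static embeddings matters for the alignment.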
Anthology ID:
2022.findings-acl.182
Volume:
Findings of the Association for Computational Linguistics: ACL 2022
Month:
May
Year:
2022
Address:
Dublin, Ireland
Editors:
Smaranda Muresan, Preslav Nakov, Aline Villavicencio
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
2316–2329
URL:
https://aclanthology.org/2022.findings-acl.182
DOI:
10.18653/v1/2022.findings-acl.182
Cite (ACL):
Katharina Hämmerl, Jindřich Libovický, and Alexander Fraser. 2022. Combining Static and Contextualised Multilingual Embeddings. In Findings of the Association for Computational Linguistics: ACL 2022, pages 2316–2329, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
Combining Static and Contextualised Multilingual Embeddings (Hämmerl et al., Findings 2022)
PDF:
https://preview.aclanthology.org/naacl-24-ws-corrections/2022.findings-acl.182.pdf
Software:
 2022.findings-acl.182.software.zip
Video:
 https://preview.aclanthology.org/naacl-24-ws-corrections/2022.findings-acl.182.mp4
Code:
kathyhaem/combining-static-contextual
Data:
TyDiQA, XQuAD