Massively Multilingual Lexical Specialization of Multilingual Transformers

Tommaso Green, Simone Paolo Ponzetto, Goran Glavaš


Abstract
While pretrained language models (PLMs) primarily serve as general-purpose text encoders that can be fine-tuned for a wide variety of downstream tasks, recent work has shown that they can also be rewired to produce high-quality word representations (i.e., static word embeddings) and yield good performance in type-level lexical tasks. Whereas existing work has primarily focused on the lexical specialization of monolingual PLMs with immense quantities of monolingual constraints, in this work we expose massively multilingual transformers (MMTs, e.g., mBERT or XLM-R) to multilingual lexical knowledge at scale, leveraging BabelNet as a readily available, rich source of multilingual and cross-lingual type-level lexical knowledge. Concretely, we use BabelNet's multilingual synsets to create synonym pairs (or synonym-gloss pairs) across 50 languages and then subject the MMTs (mBERT and XLM-R) to a lexical specialization procedure guided by a contrastive objective. We show that such massively multilingual lexical specialization brings substantial gains in two standard cross-lingual lexical tasks, bilingual lexicon induction and cross-lingual word similarity, as well as in cross-lingual sentence retrieval. Crucially, we observe gains for languages unseen during specialization, indicating that multilingual lexical specialization enables generalization to languages with no lexical constraints. In a series of subsequent controlled experiments, we show that the number of specialization constraints plays a much greater role than the set of languages from which they originate.
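To make the specialization procedure concrete, below is a minimal sketch of contrastive lexical specialization of an MMT on cross-lingual synonym pairs, using a standard InfoNCE objective with in-batch negatives. The pooling strategy, temperature, learning rate, and toy word pairs are illustrative assumptions, not the paper's exact configuration.

```python
# Hypothetical sketch: contrastive lexical specialization of an MMT on
# synonym pairs (in-batch negatives, InfoNCE). Hyperparameters and the toy
# pairs below are illustrative assumptions, not the paper's actual setup.
import torch
import torch.nn.functional as F
from transformers import AutoModel, AutoTokenizer

model_name = "xlm-roberta-base"  # mBERT ("bert-base-multilingual-cased") also works
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)

def encode(words):
    """Mean-pool subword representations into one vector per word."""
    batch = tokenizer(words, padding=True, return_tensors="pt")
    hidden = model(**batch).last_hidden_state           # (B, T, H)
    mask = batch["attention_mask"].unsqueeze(-1)        # (B, T, 1)
    return (hidden * mask).sum(1) / mask.sum(1)         # (B, H)

# Toy cross-lingual synonym pairs, as would be drawn from BabelNet synsets.
pairs = [("dog", "Hund"), ("house", "casa"), ("water", "eau"), ("book", "kitap")]
left, right = zip(*pairs)
z_l, z_r = encode(list(left)), encode(list(right))

# InfoNCE: each word's cross-lingual synonym is the positive; all other
# in-batch words serve as negatives. Symmetrized over both directions.
temperature = 0.05
logits = F.normalize(z_l, dim=-1) @ F.normalize(z_r, dim=-1).T / temperature
labels = torch.arange(len(pairs))
loss = (F.cross_entropy(logits, labels) + F.cross_entropy(logits.T, labels)) / 2

loss.backward()
optimizer.step()
print(f"contrastive loss: {loss.item():.4f}")
```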
Anthology ID:
2023.acl-long.426
Volume:
Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
7700–7715
URL:
https://aclanthology.org/2023.acl-long.426
DOI:
10.18653/v1/2023.acl-long.426
Cite (ACL):
Tommaso Green, Simone Paolo Ponzetto, and Goran Glavaš. 2023. Massively Multilingual Lexical Specialization of Multilingual Transformers. In Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 7700–7715, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Massively Multilingual Lexical Specialization of Multilingual Transformers (Green et al., ACL 2023)
PDF:
https://preview.aclanthology.org/ingest-2024-clasp/2023.acl-long.426.pdf
Video:
https://preview.aclanthology.org/ingest-2024-clasp/2023.acl-long.426.mp4