Léo Jacqmin


2023

“Mann” is to “Donna” as「国王」is to « Reine »: Adapting the Analogy Task for Multilingual and Contextual Embeddings
Timothee Mickus | Eduardo Calò | Léo Jacqmin | Denis Paperno | Mathieu Constant
Proceedings of the 12th Joint Conference on Lexical and Computational Semantics (*SEM 2023)

How does the word analogy task fit in the modern NLP landscape? Given the rarity of comparable multilingual benchmarks and the lack of a consensual evaluation protocol for contextual models, this remains an open question. In this paper, we introduce MATS: a multilingual analogy dataset covering forty analogical relations in six languages, and evaluate human performance as well as that of static and contextual embeddings on the task. We find that not all analogical relations are equally straightforward for humans, that static models remain competitive with contextual embeddings, and that optimal settings vary across languages and analogical relations. Several key challenges remain, including creating benchmarks that align with human reasoning and understanding what drives differences across methodologies.
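For readers unfamiliar with how analogy benchmarks are typically scored, the sketch below shows the classic vector-offset ("3CosAdd") formulation for static word embeddings: given a : b :: c : ?, predict the vocabulary word closest to b − a + c. This is only an illustration of the standard setup, not the MATS evaluation protocol; the toy vectors, the `EMB` table, and `solve_analogy` are hypothetical placeholders.

```python
# Minimal sketch of the vector-offset ("3CosAdd") analogy test for static
# embeddings. Toy, hand-made vectors for illustration only -- not real
# embeddings and not the protocol used in the paper.
import numpy as np

# Hypothetical toy embedding table (word -> unit-norm vector).
EMB = {w: v / np.linalg.norm(v) for w, v in {
    "mann":    np.array([0.9, 0.1, 0.2]),
    "frau":    np.array([0.8, 0.6, 0.1]),
    "könig":   np.array([0.7, 0.1, 0.7]),
    "königin": np.array([0.6, 0.6, 0.6]),
}.items()}

def solve_analogy(a: str, b: str, c: str) -> str:
    """Return the word maximising cos(w, b - a + c), excluding a, b and c."""
    target = EMB[b] - EMB[a] + EMB[c]
    target /= np.linalg.norm(target)
    scores = {w: float(v @ target) for w, v in EMB.items() if w not in {a, b, c}}
    return max(scores, key=scores.get)

# "mann" is to "frau" as "könig" is to ... ?
print(solve_analogy("mann", "frau", "könig"))  # expected: "königin"
```

Contextual models have no single agreed-upon counterpart to this procedure, which is part of the evaluation-protocol gap the paper addresses.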