Are All Good Word Vector Spaces Isomorphic?

Ivan Vulić, Sebastian Ruder, Anders Søgaard


Abstract
Existing algorithms for aligning cross-lingual word vector spaces assume that vector spaces are approximately isomorphic. As a result, they perform poorly or fail completely on non-isomorphic spaces. Such non-isomorphism has been hypothesised to result from typological differences between languages. In this work, we ask whether non-isomorphism is also crucially a sign of degenerate word vector spaces. We present a series of experiments across diverse languages which show that variance in performance across language pairs is not only due to typological differences, but can mostly be attributed to the size of the monolingual resources available, and to the properties and duration of monolingual training (e.g. “under-training”).
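The alignment approach the abstract alludes to is commonly formulated as an orthogonal (Procrustes) mapping between two vector spaces; the sketch below illustrates that idea, assuming a small seed dictionary of translation pairs. It is a generic illustration, not the paper's specific experimental setup.

```python
import numpy as np

def procrustes_align(X, Y):
    """Find the orthogonal map W minimising ||XW - Y||_F.

    X, Y: (n, d) arrays of source/target word vectors for n translation pairs.
    Returns an orthogonal W such that X @ W approximates Y.
    """
    # SVD of the cross-covariance matrix yields the optimal rotation
    U, _, Vt = np.linalg.svd(X.T @ Y)
    return U @ Vt

# Toy example: if Y is an exact rotation of X (i.e. the spaces are
# isomorphic), the recovered map aligns them perfectly.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 5))
R, _ = np.linalg.qr(rng.standard_normal((5, 5)))  # random orthogonal matrix
Y = X @ R
W = procrustes_align(X, Y)
print(np.allclose(X @ W, Y))
```

When the two spaces are not approximately isomorphic, no orthogonal map can make the residual small, which is why such methods degrade on the language pairs the paper studies.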
Anthology ID:
2020.emnlp-main.257
Volume:
Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP)
Month:
November
Year:
2020
Address:
Online
Editors:
Bonnie Webber, Trevor Cohn, Yulan He, Yang Liu
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
3178–3192
URL:
https://aclanthology.org/2020.emnlp-main.257
DOI:
10.18653/v1/2020.emnlp-main.257
Cite (ACL):
Ivan Vulić, Sebastian Ruder, and Anders Søgaard. 2020. Are All Good Word Vector Spaces Isomorphic?. In Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), pages 3178–3192, Online. Association for Computational Linguistics.
Cite (Informal):
Are All Good Word Vector Spaces Isomorphic? (Vulić et al., EMNLP 2020)
PDF:
https://preview.aclanthology.org/emnlp22-frontmatter/2020.emnlp-main.257.pdf
Optional supplementary material:
2020.emnlp-main.257.OptionalSupplementaryMaterial.zip
Video:
https://slideslive.com/38939005
Code:
cambridgeltl/iso-study
Data:
PanLex