Why is unsupervised alignment of English embeddings from different algorithms so hard?

Mareike Hartmann, Yova Kementchedjhieva, Anders Søgaard


Abstract
This paper presents a challenge to the community: generative adversarial networks (GANs) can perfectly align independent English word embeddings induced using the same algorithm, based on distributional information alone, but fail to do so for embeddings induced by two different algorithms. Why is that? We believe understanding why is key to understanding both modern word embedding algorithms and the limitations and instability dynamics of GANs. This paper shows that (a) in all the cases where alignment fails, there exists a linear transform between the two embeddings (so algorithmic biases do not lead to non-linear differences), and (b) similar effects cannot easily be obtained by varying hyper-parameters. One plausible suggestion based on our initial experiments is that differences in the inductive biases of the embedding algorithms lead to an optimization landscape riddled with local optima and a very small basin of convergence, but we present this more as a challenge paper than as a technical contribution.
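As a concrete illustration of claim (a), the existence of a linear (here, orthogonal) map between two embedding spaces over a shared vocabulary can be checked with orthogonal Procrustes. The sketch below is not the paper's code: it uses scipy and synthetic matrices as stand-ins for real embeddings, and all names and dimensions are illustrative assumptions.

```python
import numpy as np
from scipy.linalg import orthogonal_procrustes

# Two toy "embedding" matrices over the same vocabulary (rows aligned).
# In the paper's setting these would be, e.g., embeddings of the same
# English words from two different algorithms; here we simulate that by
# rotating one space and adding a little noise.
rng = np.random.default_rng(0)
vocab_size, dim = 5000, 300
X = rng.standard_normal((vocab_size, dim))            # embeddings, algorithm A
Q, _ = np.linalg.qr(rng.standard_normal((dim, dim)))  # random orthogonal map
Y = X @ Q + 0.01 * rng.standard_normal((vocab_size, dim))  # embeddings, algorithm B

# Solve min_W ||X W - Y||_F subject to W orthogonal (Procrustes).
W, _ = orthogonal_procrustes(X, Y)

# A small residual after the best orthogonal map means the two spaces
# differ (almost) only by a linear transform.
residual = np.linalg.norm(X @ W - Y) / np.linalg.norm(Y)
print(f"relative residual after Procrustes: {residual:.4f}")
```

If such a map exists even across embedding algorithms, as the paper reports, then the failure of GAN-based alignment cannot be explained by non-linear differences between the spaces; the difficulty must lie in the adversarial optimization itself.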
Anthology ID:
D18-1056
Volume:
Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing
Month:
October-November
Year:
2018
Address:
Brussels, Belgium
Editors:
Ellen Riloff, David Chiang, Julia Hockenmaier, Jun’ichi Tsujii
Venue:
EMNLP
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Pages:
582–586
URL:
https://aclanthology.org/D18-1056
DOI:
10.18653/v1/D18-1056
Cite (ACL):
Mareike Hartmann, Yova Kementchedjhieva, and Anders Søgaard. 2018. Why is unsupervised alignment of English embeddings from different algorithms so hard? In Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing, pages 582–586, Brussels, Belgium. Association for Computational Linguistics.
Cite (Informal):
Why is unsupervised alignment of English embeddings from different algorithms so hard? (Hartmann et al., EMNLP 2018)
PDF:
https://preview.aclanthology.org/emnlp22-frontmatter/D18-1056.pdf
Video:
https://vimeo.com/305196498