Denoising Word Embeddings by Averaging in a Shared Space

Avi Caciularu, Ido Dagan, Jacob Goldberger


Abstract
We introduce a new approach for smoothing and improving the quality of word embeddings. We consider a method of fusing word embeddings that were trained on the same corpus but with different initializations. We project all the models to a shared vector space using an efficient implementation of the Generalized Procrustes Analysis (GPA) procedure, previously used in multilingual word translation. Our word representation demonstrates consistent improvements over the raw models as well as their simplistic average, on a range of tasks. As the new representations are more stable and reliable, there is a noticeable improvement in rare word evaluations.
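The core idea — aligning several independently trained embedding matrices into one shared space with Generalized Procrustes Analysis and then averaging them — can be sketched as follows. This is a minimal illustration assuming NumPy; the function names are hypothetical and the paper's efficient GPA implementation may differ in its optimization details.

```python
import numpy as np

def procrustes_align(X, Y):
    """Orthogonal Procrustes: the rotation R minimizing ||X @ R - Y||_F.

    Closed-form solution via SVD of the cross-covariance X^T Y.
    """
    U, _, Vt = np.linalg.svd(X.T @ Y)
    return U @ Vt

def gpa_average(models, n_iters=10):
    """GPA-style fusion: repeatedly align each embedding matrix to the
    running mean, then recompute the mean over the aligned copies.

    `models` is a list of (vocab_size, dim) arrays sharing a vocabulary,
    e.g. embeddings trained on the same corpus with different seeds.
    """
    mean = models[0].copy()
    for _ in range(n_iters):
        aligned = [X @ procrustes_align(X, mean) for X in models]
        mean = np.mean(aligned, axis=0)
    return mean
```

As a sanity check, aligning a rotated copy of an embedding matrix back onto the original recovers it exactly, so averaging the aligned models cancels initialization-dependent rotations while preserving shared semantic structure.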
Anthology ID:
2021.starsem-1.28
Volume:
Proceedings of *SEM 2021: The Tenth Joint Conference on Lexical and Computational Semantics
Month:
August
Year:
2021
Address:
Online
Venue:
*SEM
SIG:
SIGLEX
Publisher:
Association for Computational Linguistics
Pages:
294–301
URL:
https://aclanthology.org/2021.starsem-1.28
DOI:
10.18653/v1/2021.starsem-1.28
Cite (ACL):
Avi Caciularu, Ido Dagan, and Jacob Goldberger. 2021. Denoising Word Embeddings by Averaging in a Shared Space. In Proceedings of *SEM 2021: The Tenth Joint Conference on Lexical and Computational Semantics, pages 294–301, Online. Association for Computational Linguistics.
Cite (Informal):
Denoising Word Embeddings by Averaging in a Shared Space (Caciularu et al., *SEM 2021)
PDF:
https://preview.aclanthology.org/emnlp-22-attachments/2021.starsem-1.28.pdf