Specializing Word Vectors by Spectral Decomposition on Heterogeneously Twisted Graphs

Yuanhang Ren, Ye Du


Abstract
Traditional word vectors, such as word2vec and GloVe, have a well-known tendency to conflate semantic similarity with other semantic relations. A retrofitting procedure may be needed to address this issue. In this work, we propose a new retrofitting method called Heterogeneously Retrofitted Spectral Word Embedding. It heterogeneously twists the similarity matrix of word pairs with lexical constraints. A new set of word vectors is generated by a spectral decomposition of the twisted similarity matrix, which admits a closed-form linear algebraic solution. Our method performs competitively with state-of-the-art retrofitting methods such as AR (CITATION). In addition, since our embedding has a clear linear algebraic relationship with the similarity matrix, we carefully study the contribution of each component of our model. Last but not least, our method is highly efficient to execute.
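To make the overall recipe concrete, below is a minimal sketch of the general idea the abstract describes: build a cosine-similarity matrix from pretrained vectors, "twist" it with lexical constraints (raising similarities for synonym pairs, lowering them for antonym pairs), and recover new vectors by spectral (eigen)decomposition. This is not the paper's exact formulation; the specific update rules, the parameter names beta_syn and beta_ant, and the choice to scale eigenvectors by the square root of clipped eigenvalues are illustrative assumptions.

```python
import numpy as np

def specialize_embeddings(W, syn_pairs, ant_pairs, beta_syn=1.0, beta_ant=1.0, dim=300):
    """Sketch: twist a cosine-similarity matrix with lexical constraints,
    then recover word vectors via spectral decomposition (assumed recipe,
    not the paper's exact method)."""
    # Row-normalize so that W @ W.T is a cosine-similarity matrix.
    Wn = W / np.linalg.norm(W, axis=1, keepdims=True)
    S = Wn @ Wn.T

    # Twist the matrix: push synonym similarities toward 1 and
    # antonym similarities toward -1 (illustrative update rules).
    for i, j in syn_pairs:
        S[i, j] = S[j, i] = S[i, j] + beta_syn * (1.0 - S[i, j])
    for i, j in ant_pairs:
        S[i, j] = S[j, i] = S[i, j] - beta_ant * (S[i, j] + 1.0)

    # Spectral decomposition of the symmetric twisted matrix:
    # keep the top-`dim` eigenpairs.
    eigvals, eigvecs = np.linalg.eigh(S)
    idx = np.argsort(eigvals)[::-1][:dim]
    eigvals, eigvecs = eigvals[idx], eigvecs[:, idx]

    # New vectors: eigenvectors scaled by the square root of the
    # (non-negative part of the) eigenvalues.
    return eigvecs * np.sqrt(np.clip(eigvals, 0.0, None))
```

In this sketch the specialized vectors reproduce the twisted similarities approximately, since S is rank-truncated to `dim` dimensions; the released implementation (ryh95/hrswe, linked under Code below) should be consulted for the method actually evaluated in the paper.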
Anthology ID:
2020.coling-main.321
Volume:
Proceedings of the 28th International Conference on Computational Linguistics
Month:
December
Year:
2020
Address:
Barcelona, Spain (Online)
Editors:
Donia Scott, Nuria Bel, Chengqing Zong
Venue:
COLING
Publisher:
International Committee on Computational Linguistics
Pages:
3599–3609
URL:
https://aclanthology.org/2020.coling-main.321
DOI:
10.18653/v1/2020.coling-main.321
Cite (ACL):
Yuanhang Ren and Ye Du. 2020. Specializing Word Vectors by Spectral Decomposition on Heterogeneously Twisted Graphs. In Proceedings of the 28th International Conference on Computational Linguistics, pages 3599–3609, Barcelona, Spain (Online). International Committee on Computational Linguistics.
Cite (Informal):
Specializing Word Vectors by Spectral Decomposition on Heterogeneously Twisted Graphs (Ren & Du, COLING 2020)
PDF:
https://aclanthology.org/2020.coling-main.321.pdf
Code
ryh95/hrswe