Exploring Vector Spaces for Semantic Relations

Kata Gábor, Haïfa Zargayouna, Isabelle Tellier, Davide Buscaldi, Thierry Charnois

Abstract
Word embeddings are used with success for a variety of tasks involving lexical semantic similarities between individual words. Using only unsupervised methods and cosine similarity, encouraging results have been obtained for analogical similarities. In this paper, we explore the potential of pre-trained word embeddings to identify generic types of semantic relations in an unsupervised experiment. We propose a new relational similarity measure based on the combination of word2vec’s CBOW input and output vectors, which outperforms competing vector representations when used for unsupervised clustering on SemEval 2010 Relation Classification data.
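
The abstract only sketches the relational similarity measure. The snippet below is a minimal illustration of the underlying idea, not the paper's exact formulation: it assumes hypothetical precomputed CBOW input and output embedding matrices (`W_in`, `W_out`) and a placeholder vocabulary, combines both vector spaces into a single representation per word pair, and clusters those pair representations. The specific combination function and clustering setup used in the paper may differ.

```python
# Minimal sketch (assumed combination, not the paper's exact method):
# represent a word pair by combining CBOW input and output vectors,
# then cluster the pair representations without supervision.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
vocab = {"coffee": 0, "cup": 1, "engine": 2, "car": 3}  # hypothetical vocabulary
dim = 50
W_in = rng.normal(size=(len(vocab), dim))   # stand-in for CBOW input (word) vectors
W_out = rng.normal(size=(len(vocab), dim))  # stand-in for CBOW output (context) vectors

def pair_vector(w1, w2):
    """One possible input/output combination for a word pair (e1, e2):
    concatenate e1's input vector with e2's output vector, and vice versa."""
    i, j = vocab[w1], vocab[w2]
    return np.concatenate([W_in[i], W_out[j], W_in[j], W_out[i]])

def relational_similarity(pair_a, pair_b):
    """Cosine similarity between two pair representations."""
    a, b = pair_vector(*pair_a), pair_vector(*pair_b)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Unsupervised clustering of pair representations (cf. SemEval-2010 Task 8 pairs).
pairs = [("coffee", "cup"), ("engine", "car")]
X = np.stack([pair_vector(*p) for p in pairs])
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
print(relational_similarity(pairs[0], pairs[1]), labels)
```

In this sketch, pairs whose combined input/output representations have high relational similarity would tend to fall into the same cluster, mirroring the unsupervised evaluation described in the abstract.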
Anthology ID: D17-1193
Volume: Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing
Month: September
Year: 2017
Address: Copenhagen, Denmark
Editors: Martha Palmer, Rebecca Hwa, Sebastian Riedel
Venue: EMNLP
SIG: SIGDAT
Publisher: Association for Computational Linguistics
Pages: 1814–1823
URL: https://aclanthology.org/D17-1193
DOI: 10.18653/v1/D17-1193
Cite (ACL): Kata Gábor, Haïfa Zargayouna, Isabelle Tellier, Davide Buscaldi, and Thierry Charnois. 2017. Exploring Vector Spaces for Semantic Relations. In Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing, pages 1814–1823, Copenhagen, Denmark. Association for Computational Linguistics.
Cite (Informal): Exploring Vector Spaces for Semantic Relations (Gábor et al., EMNLP 2017)
PDF: https://preview.aclanthology.org/teach-a-man-to-fish/D17-1193.pdf
Data: SemEval-2010 Task-8