Abstract
Semantic networks and semantic spaces have been two prominent approaches to representing lexical semantics. While a unified account of lexical meaning relies on being able to convert between these representations in both directions, the conversion from semantic networks into semantic spaces has only recently begun to attract more attention. In this paper we present a methodology for this conversion and assess it with a case study. When it is applied to WordNet, the resulting embeddings perform very well on a mainstream semantic similarity task, substantially outperforming word embeddings trained on very large collections of texts, such as word2vec.
- Anthology ID: W18-3016
- Volume: Proceedings of the Third Workshop on Representation Learning for NLP
- Month: July
- Year: 2018
- Address: Melbourne, Australia
- Editors: Isabelle Augenstein, Kris Cao, He He, Felix Hill, Spandana Gella, Jamie Kiros, Hongyuan Mei, Dipendra Misra
- Venue: RepL4NLP
- SIG: SIGREP
- Publisher: Association for Computational Linguistics
- Pages: 122–131
- URL: https://aclanthology.org/W18-3016
- DOI: 10.18653/v1/W18-3016
- Cite (ACL): Chakaveh Saedi, António Branco, João António Rodrigues, and João Silva. 2018. WordNet Embeddings. In Proceedings of the Third Workshop on Representation Learning for NLP, pages 122–131, Melbourne, Australia. Association for Computational Linguistics.
- Cite (Informal): WordNet Embeddings (Saedi et al., RepL4NLP 2018)
- PDF: https://preview.aclanthology.org/ingest-bitext-workshop/W18-3016.pdf
- Code: nlx-group/WordNetEmbeddings
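The authors' implementation is available in the linked nlx-group/WordNetEmbeddings repository. As a rough, self-contained sketch of the general idea described in the abstract (turning a semantic network's relational structure into dense vectors), and not the paper's actual pipeline, the snippet below builds an adjacency matrix over a small WordNet subgraph with NLTK, accumulates multi-hop relatedness with a truncated, decayed walk sum, and compresses it with an SVD. The chosen subgraph, relation types, decay factor, walk depth, and dimensionality are all illustrative assumptions, not the authors' settings.

```python
"""Illustrative sketch: semantic network -> semantic space.

Not the paper's exact pipeline; a minimal demonstration of the general idea.
Requires: pip install nltk numpy, then: python -c "import nltk; nltk.download('wordnet')"
"""
import numpy as np
from nltk.corpus import wordnet as wn

# Small demo subgraph: carnivore.n.01 and its hyponym closure (illustrative choice).
root = wn.synset("carnivore.n.01")
synsets = list(dict.fromkeys([root] + list(root.closure(lambda s: s.hyponyms()))))
index = {s: i for i, s in enumerate(synsets)}
n = len(synsets)

# Symmetric adjacency matrix over hypernym/hyponym edges inside the subgraph.
A = np.zeros((n, n), dtype=np.float32)
for s in synsets:
    i = index[s]
    for t in s.hyponyms() + s.hypernyms():
        j = index.get(t)
        if j is not None:
            A[i, j] = A[j, i] = 1.0

# Accumulate decayed walks up to length K: M = sum_{k=0..K} alpha^k * A^k
# (alpha and K are illustrative assumptions).
alpha, K = 0.75, 4
M = np.eye(n, dtype=np.float32)
Ak = np.eye(n, dtype=np.float32)
for k in range(1, K + 1):
    Ak = Ak @ A
    M += (alpha ** k) * Ak

# Compress the relatedness matrix into dense, L2-normalised embeddings via SVD.
dim = min(50, n)
U, S, _ = np.linalg.svd(M, full_matrices=False)
E = U[:, :dim] * S[:dim]                                 # one row per synset
E /= np.linalg.norm(E, axis=1, keepdims=True) + 1e-12

def cos(a: str, b: str) -> float:
    """Cosine similarity between two synsets' vectors (KeyError if outside the subgraph)."""
    return float(E[index[wn.synset(a)]] @ E[index[wn.synset(b)]])

print(cos("dog.n.01", "wolf.n.01"))   # dog and wolf share the parent synset canine.n.02
print(cos("dog.n.01", "bear.n.01"))   # dog and bear are linked only through carnivore.n.01
```

Cosine similarities over such vectors can then be compared against human ratings from a similarity benchmark, which is the kind of semantic similarity evaluation the abstract refers to.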