Abstract
This paper compares two approaches to word sense disambiguation using word embeddings trained on unambiguous synonyms. The first is an unsupervised method based on computing log probabilities from sequences of word embedding vectors, taking ambiguous word senses into account and guessing the correct sense from context. The second method is supervised: we use a multilayer neural network model to learn a context-sensitive transformation that maps the input vector of an ambiguous word into an output vector representing its sense. We evaluate both methods on corpora with manual annotations of word senses from the Polish wordnet (plWordnet).
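The sketch below is not the authors' code; it only illustrates the unsupervised idea described in the abstract under simplifying assumptions: each sense of an ambiguous word is represented by the embedding of an unambiguous synonym, and the sense whose vector best fits the surrounding context is chosen. Cosine similarity to the averaged context stands in for the paper's log-probability scoring, and all names (`sense_vectors`, `context_vectors`, the `zamek` senses) are hypothetical.

```python
import numpy as np


def cosine(a, b):
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))


def disambiguate(sense_vectors, context_vectors):
    """Pick the sense whose unambiguous-synonym embedding is most similar to the
    averaged context embedding (a stand-in for the paper's sequence scoring)."""
    context = np.mean(context_vectors, axis=0)
    return max(sense_vectors, key=lambda sense: cosine(sense_vectors[sense], context))


# Toy usage with random 100-dimensional vectors in place of trained embeddings.
rng = np.random.default_rng(0)
senses = {
    "zamek.1-castle": rng.normal(size=100),  # hypothetical sense labels
    "zamek.2-lock": rng.normal(size=100),
}
context = [rng.normal(size=100) for _ in range(5)]
print(disambiguate(senses, context))
```

The supervised variant described in the abstract would instead train a multilayer network on sense-annotated examples to map the ambiguous word's vector (plus context) to an output vector compared against the candidate sense embeddings.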
- Anthology ID: W17-1915
- Volume: Proceedings of the 1st Workshop on Sense, Concept and Entity Representations and their Applications
- Month: April
- Year: 2017
- Address: Valencia, Spain
- Editors: Jose Camacho-Collados, Mohammad Taher Pilehvar
- Venue: SENSE
- Publisher: Association for Computational Linguistics
- Pages: 120–125
- URL: https://aclanthology.org/W17-1915
- DOI: 10.18653/v1/W17-1915
- Cite (ACL): Aleksander Wawer and Agnieszka Mykowiecka. 2017. Supervised and Unsupervised Word Sense Disambiguation on Word Embedding Vectors of Unambigous Synonyms. In Proceedings of the 1st Workshop on Sense, Concept and Entity Representations and their Applications, pages 120–125, Valencia, Spain. Association for Computational Linguistics.
- Cite (Informal): Supervised and Unsupervised Word Sense Disambiguation on Word Embedding Vectors of Unambigous Synonyms (Wawer & Mykowiecka, SENSE 2017)
- PDF: https://preview.aclanthology.org/nschneid-patch-1/W17-1915.pdf