Together We Make Sense–Learning Meta-Sense Embeddings

Haochen Luo, Yi Zhou, Danushka Bollegala


Abstract
Sense embedding learning methods learn multiple vectors for a given ambiguous word, corresponding to its different word senses. For this purpose, different methods have been proposed in prior work on sense embedding learning that use different sense inventories, sense-tagged corpora and learning methods. However, not all existing sense embeddings cover all senses of ambiguous words equally well due to the discrepancies in their training resources. To address this problem, we propose the first-ever meta-sense embedding method, Neighbour Preserving Meta-Sense Embeddings, which learns meta-sense embeddings by combining multiple independently trained source sense embeddings such that the sense neighbourhoods computed from the source embeddings are preserved in the meta-embedding space. Our proposed method can combine source sense embeddings that cover different sets of word senses. Experimental results on Word Sense Disambiguation (WSD) and Word-in-Context (WiC) tasks show that the proposed meta-sense embedding method consistently outperforms several competitive baselines. An anonymised version of the source code implementation of our proposed method has been submitted to the reviewing system. Both the source code and the learnt meta-sense embeddings will be publicly released upon paper acceptance.
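To illustrate the idea described in the abstract, the sketch below shows one way a neighbour-preserving meta-sense embedding objective could look: for each source, the k-nearest-neighbour cosine similarities among its senses are computed, and meta embeddings are trained so those similarities are reproduced in the shared space. This is a minimal illustration under stated assumptions, not the authors' implementation; the function names, hyperparameters, and the MSE formulation are hypothetical choices made for the example.

```python
import torch
import torch.nn.functional as F


def neighbourhoods(vecs, k):
    """Return (indices, cosine similarities) of the k nearest neighbours of each row."""
    normed = F.normalize(vecs, dim=1)
    sims = normed @ normed.T
    sims.fill_diagonal_(-float("inf"))  # exclude self-similarity
    top_sims, top_idx = sims.topk(k, dim=1)
    return top_idx, top_sims


def learn_meta_sense_embeddings(sources, dim=300, k=5, epochs=200, lr=0.01):
    """sources: list of dicts {sense_id: 1-D tensor}; the dicts may cover
    different (overlapping) sets of senses. Returns {sense_id: meta vector}."""
    all_ids = sorted(set().union(*[s.keys() for s in sources]))
    index = {sid: i for i, sid in enumerate(all_ids)}
    meta = torch.nn.Parameter(0.1 * torch.randn(len(all_ids), dim))
    opt = torch.optim.Adam([meta], lr=lr)

    # Collect (i, j, target-similarity) triples from every source's k-NN graph.
    pairs_i, pairs_j, targets = [], [], []
    for src in sources:
        ids = sorted(src.keys())
        vecs = torch.stack([src[s] for s in ids])
        nn_idx, nn_sim = neighbourhoods(vecs, min(k, len(ids) - 1))
        rows = [index[s] for s in ids]
        for a in range(len(ids)):
            for b in range(nn_idx.shape[1]):
                pairs_i.append(rows[a])
                pairs_j.append(rows[nn_idx[a, b].item()])
                targets.append(nn_sim[a, b].item())
    i_idx = torch.tensor(pairs_i)
    j_idx = torch.tensor(pairs_j)
    tgt = torch.tensor(targets)

    # Fit meta embeddings so that meta-space cosine similarities match the
    # neighbourhood similarities observed in each source space.
    for _ in range(epochs):
        opt.zero_grad()
        m = F.normalize(meta, dim=1)
        pred = (m[i_idx] * m[j_idx]).sum(dim=1)
        loss = F.mse_loss(pred, tgt)
        loss.backward()
        opt.step()
    return {sid: meta.detach()[index[sid]] for sid in all_ids}
```

The key property the sketch tries to capture is the one stated in the abstract: sources need not cover the same sense inventory, since each source only contributes constraints for the senses it contains, while every sense that appears in at least one source receives a meta embedding.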
Anthology ID:
2023.findings-acl.165
Volume:
Findings of the Association for Computational Linguistics: ACL 2023
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
2638–2651
URL:
https://aclanthology.org/2023.findings-acl.165
DOI:
10.18653/v1/2023.findings-acl.165
Cite (ACL):
Haochen Luo, Yi Zhou, and Danushka Bollegala. 2023. Together We Make Sense–Learning Meta-Sense Embeddings. In Findings of the Association for Computational Linguistics: ACL 2023, pages 2638–2651, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Together We Make Sense–Learning Meta-Sense Embeddings (Luo et al., Findings 2023)
PDF:
https://preview.aclanthology.org/nschneid-patch-5/2023.findings-acl.165.pdf
Video:
https://preview.aclanthology.org/nschneid-patch-5/2023.findings-acl.165.mp4