Abstract
This paper describes a hypernym discovery system for our participation in SemEval-2018 Task 9, which aims to discover the best (set of) candidate hypernyms for input concepts or entities within the search space of a pre-defined vocabulary. We introduce a neural network architecture for this task and empirically study various neural network models for building latent-space representations of words and phrases. The evaluated models include the convolutional neural network, long short-term memory network, gated recurrent unit, and recurrent convolutional neural network. We also explore different embedding methods, including word embeddings and sense embeddings, for better performance.
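The ranking setup described in the abstract (encode an input term, then score every entry of the pre-defined vocabulary as a candidate hypernym) can be illustrated with a minimal PyTorch sketch. This is not the authors' released code: the class name TermEncoder, the choice of a GRU encoder, and all dimensions are illustrative assumptions.

```python
# Minimal sketch, assuming a GRU term encoder and a dot-product scorer over
# a fixed candidate vocabulary; all names and sizes are illustrative.
import torch
import torch.nn as nn


class TermEncoder(nn.Module):
    def __init__(self, vocab_size, embed_dim=300, hidden_dim=300, num_candidates=10000):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)        # word (or sense) embeddings
        self.encoder = nn.GRU(embed_dim, hidden_dim, batch_first=True)  # CNN/LSTM/RCNN could be swapped in here
        # one embedding per candidate hypernym in the pre-defined search space
        self.candidate_embed = nn.Embedding(num_candidates, hidden_dim)

    def forward(self, query_ids):
        # query_ids: (batch, num_words) word indices of the input concept or entity
        _, h = self.encoder(self.embed(query_ids))              # h: (1, batch, hidden_dim)
        query_vec = h.squeeze(0)                                # (batch, hidden_dim)
        # score every candidate hypernym by dot product with the query vector
        return query_vec @ self.candidate_embed.weight.T        # (batch, num_candidates)


# Usage: rank candidates for one (possibly multi-word) query term and keep the
# top-scoring ones; the task permits up to 15 hypernyms per query.
model = TermEncoder(vocab_size=50000)
query = torch.tensor([[12, 345]])                 # a two-word input term
top15 = model(query).topk(15, dim=-1).indices     # indices of the best-scoring candidates
```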
- Anthology ID: S18-1147
- Volume: Proceedings of the 12th International Workshop on Semantic Evaluation
- Month: June
- Year: 2018
- Address: New Orleans, Louisiana
- Editors: Marianna Apidianaki, Saif M. Mohammad, Jonathan May, Ekaterina Shutova, Steven Bethard, Marine Carpuat
- Venue: SemEval
- SIG: SIGLEX
- Publisher: Association for Computational Linguistics
- Pages: 903–908
- URL: https://aclanthology.org/S18-1147
- DOI: 10.18653/v1/S18-1147
- Cite (ACL): Zhuosheng Zhang, Jiangtong Li, Hai Zhao, and Bingjie Tang. 2018. SJTU-NLP at SemEval-2018 Task 9: Neural Hypernym Discovery with Term Embeddings. In Proceedings of the 12th International Workshop on Semantic Evaluation, pages 903–908, New Orleans, Louisiana. Association for Computational Linguistics.
- Cite (Informal): SJTU-NLP at SemEval-2018 Task 9: Neural Hypernym Discovery with Term Embeddings (Zhang et al., SemEval 2018)
- PDF: https://preview.aclanthology.org/nschneid-patch-5/S18-1147.pdf
- Data: SemEval-2018 Task-9