Abstract
Word embedding algorithms have become a common tool in the field of natural language processing. While some, like Word2Vec, are based on sequential text input, others utilize a graph representation of the text. In this paper, we introduce a new algorithm, named WordGraph2Vec (WG2V for short), which combines the two approaches to gain the benefits of both. The algorithm uses a directed word graph to provide additional information to sequential text input algorithms. Our experiments on benchmark datasets show that text classification algorithms are nearly as accurate with WG2V as with other word embedding models, while WG2V preserves more stable accuracy rankings.
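The abstract only outlines the idea at a high level. The sketch below is a hypothetical illustration, not the paper's actual WG2V implementation: it shows how a directed word graph built from a corpus could supply extra (target, context) pairs on top of the linear sliding-window pairs used by a Word2Vec-style model. The helper names `build_word_graph` and `graph_context_pairs`, the window size, and the top-k edge selection are assumptions made for this example.

```python
# Illustrative sketch only -- not the authors' implementation of WG2V.
from collections import defaultdict

def build_word_graph(sentences, window=2):
    """Directed word graph: an edge u -> v (with a count-based weight)
    whenever v follows u within `window` positions in some sentence."""
    graph = defaultdict(lambda: defaultdict(int))
    for tokens in sentences:
        for i, u in enumerate(tokens):
            for v in tokens[i + 1 : i + 1 + window]:
                graph[u][v] += 1
    return graph

def graph_context_pairs(graph, top_k=3):
    """Extra training pairs taken from each node's strongest outgoing edges;
    these could augment the usual skip-gram training pairs."""
    for u, neighbours in graph.items():
        best = sorted(neighbours.items(), key=lambda kv: -kv[1])[:top_k]
        for v, _weight in best:
            yield (u, v)

if __name__ == "__main__":
    corpus = [
        ["graphs", "enhance", "word", "embedding"],
        ["word", "embedding", "captures", "semantic", "relations"],
    ]
    g = build_word_graph(corpus)
    print(list(graph_context_pairs(g)))
```

In such a setup, the graph-derived pairs would simply be added to the training stream of the sequential model, so frequent directed co-occurrences reinforce the embeddings beyond what a single linear window captures.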
- Anthology ID: D19-5305
- Volume: Proceedings of the Thirteenth Workshop on Graph-Based Methods for Natural Language Processing (TextGraphs-13)
- Month: November
- Year: 2019
- Address: Hong Kong
- Venue: TextGraphs
- Publisher: Association for Computational Linguistics
- Pages: 32–41
- URL: https://aclanthology.org/D19-5305
- DOI: 10.18653/v1/D19-5305
- Cite (ACL): Matan Zuckerman and Mark Last. 2019. Using Graphs for Word Embedding with Enhanced Semantic Relations. In Proceedings of the Thirteenth Workshop on Graph-Based Methods for Natural Language Processing (TextGraphs-13), pages 32–41, Hong Kong. Association for Computational Linguistics.
- Cite (Informal): Using Graphs for Word Embedding with Enhanced Semantic Relations (Zuckerman & Last, TextGraphs 2019)
- PDF: https://preview.aclanthology.org/starsem-semeval-split/D19-5305.pdf