Jie Shen


2016

Improved Word Embeddings with Implicit Structure Information
Jie Shen | Cong Liu
Proceedings of COLING 2016, the 26th International Conference on Computational Linguistics: Technical Papers

Distributed word representation is an efficient method for capturing semantic and syntactic word relations. In this work, we introduce an extension to the continuous bag-of-words model for learning word representations efficiently by using implicit structure information. Instead of relying on a syntactic parser, which can be noisy and slow to build, we compute weights representing probabilities of syntactic relations based on the Huffman softmax tree, using an efficient heuristic. "Implicit graphs" constructed from these weights show that they encode useful structure information. Extensive experiments on several word similarity and word analogy tasks show gains over the basic continuous bag-of-words model.
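The core idea can be illustrated with a minimal sketch: instead of the uniform average over context vectors that basic CBOW uses, each context word contributes according to a weight. The function and weight values below are hypothetical; the paper derives its actual weights from a Huffman-softmax-based heuristic, which is not reproduced here.

```python
import numpy as np

def weighted_cbow_context(embeddings, context_ids, weights):
    """Combine context word vectors with per-word weights rather than a
    plain average (illustrative weighted-CBOW variant; the paper's real
    weights come from its Huffman softmax tree heuristic)."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()                      # normalize to a probability distribution
    vecs = embeddings[context_ids]       # (n_context, dim) context vectors
    return w @ vecs                      # weighted sum -> (dim,) hidden vector

# Toy example: 5-word vocabulary, 4-dimensional embeddings.
rng = np.random.default_rng(0)
E = rng.normal(size=(5, 4))

ctx = [1, 3, 4]                          # context word indices
w = [0.2, 0.5, 0.3]                      # hypothetical syntactic-relation weights
h = weighted_cbow_context(E, ctx, w)
```

The resulting vector `h` would then replace the uniform context average as input to the hierarchical (Huffman) softmax output layer during training.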