Zhaohui Wu


2017

Determining Gains Acquired from Word Embedding Quantitatively Using Discrete Distribution Clustering
Jianbo Ye | Yanran Li | Zhaohui Wu | James Z. Wang | Wenjie Li | Jia Li
Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)

Word embeddings have become widely used in document analysis. While a large number of models for mapping words to vector spaces have been developed, it remains undetermined how much net gain can be achieved over traditional approaches based on bag-of-words. In this paper, we propose a new document clustering approach by combining any word embedding with a state-of-the-art algorithm for clustering empirical distributions. By using the Wasserstein distance between distributions, the word-to-word semantic relationship is taken into account in a principled way. The new clustering method is easy to use and consistently outperforms other methods on a variety of data sets. More importantly, the method provides an effective framework for determining when and how much word embeddings contribute to document analysis. Experimental results with multiple embedding models are reported.
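To make the abstract's central quantity concrete, here is a minimal sketch of the Wasserstein distance between two discrete distributions, where each distribution represents a document as weighted points in an embedding space. This is an illustrative linear-programming formulation only, not the paper's actual clustering algorithm; the function name `wasserstein` and the toy inputs are assumptions for the example.

```python
import numpy as np
from scipy.optimize import linprog

def wasserstein(X, a, Y, b):
    """Wasserstein-1 distance between discrete distributions (X, a) and
    (Y, b): X, Y hold point locations (e.g. word embedding vectors) and
    a, b hold nonnegative weights summing to 1 (e.g. word frequencies).
    Solved as the optimal-transport linear program."""
    n, m = len(a), len(b)
    # Ground cost: Euclidean distance between every pair of points.
    C = np.linalg.norm(X[:, None, :] - Y[None, :, :], axis=2)
    # Equality constraints on the flattened n*m transport plan:
    # each row of the plan must sum to a[i], each column to b[j].
    A_eq = []
    for i in range(n):
        row = np.zeros((n, m)); row[i, :] = 1.0
        A_eq.append(row.ravel())
    for j in range(m):
        col = np.zeros((n, m)); col[:, j] = 1.0
        A_eq.append(col.ravel())
    res = linprog(C.ravel(), A_eq=np.array(A_eq),
                  b_eq=np.concatenate([a, b]), bounds=(0, None))
    return res.fun

# Two one-point "documents" whose embeddings sit 5 apart:
d = wasserstein(np.array([[0.0, 0.0]]), np.array([1.0]),
                np.array([[3.0, 4.0]]), np.array([1.0]))
```

Because the ground cost is computed between embedding vectors, two documents that use different but semantically close words incur a small transport cost, which is the sense in which word-to-word relationships enter "in a principled way".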

2015

Measuring Prerequisite Relations Among Concepts
Chen Liang | Zhaohui Wu | Wenyi Huang | C. Lee Giles
Proceedings of the 2015 Conference on Empirical Methods in Natural Language Processing

Storybase: Towards Building a Knowledge Base for News Events
Zhaohui Wu | Chen Liang | C. Lee Giles
Proceedings of ACL-IJCNLP 2015 System Demonstrations

2013

Measuring Term Informativeness in Context
Zhaohui Wu | C. Lee Giles
Proceedings of the 2013 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies