Di Zhou
2020
HyperText: Endowing FastText with Hyperbolic Geometry
Yudong Zhu | Di Zhou | Jinghui Xiao | Xin Jiang | Xiao Chen | Qun Liu
Findings of the Association for Computational Linguistics: EMNLP 2020
Natural language data exhibit tree-like hierarchical structures, such as the hypernym-hyponym hierarchy in WordNet. FastText, the state-of-the-art text classifier based on a shallow neural network in Euclidean space, may not represent such hierarchies precisely given its limited representation capacity. Considering that hyperbolic space is naturally suited to modelling tree-like hierarchical data, we propose a new model named HyperText for efficient text classification by endowing FastText with hyperbolic geometry. Empirically, we show that HyperText outperforms FastText on a range of text classification tasks with substantially fewer parameters.
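The abstract rests on hyperbolic space being well suited to tree-like data. A minimal sketch of the Poincaré-ball distance, the metric commonly used for such hyperbolic embeddings, illustrates why: points near the boundary grow exponentially far apart, matching the exponential growth of tree branches. This is an illustrative assumption, not the paper's exact formulation.

```python
import numpy as np

def poincare_distance(u, v, eps=1e-9):
    """Geodesic distance between two points inside the unit Poincare ball:
    d(u, v) = arcosh(1 + 2 * ||u - v||^2 / ((1 - ||u||^2) * (1 - ||v||^2)))
    """
    sq_u = np.dot(u, u)
    sq_v = np.dot(v, v)
    sq_diff = np.dot(u - v, u - v)
    x = 1.0 + 2.0 * sq_diff / ((1.0 - sq_u) * (1.0 - sq_v) + eps)
    return np.arccosh(x)

# Near the origin the metric is almost Euclidean; near the boundary,
# distances blow up, which suits root-vs-leaf hierarchies.
root = np.array([0.0, 0.0])
leaf_a = np.array([0.9, 0.0])
leaf_b = np.array([0.0, 0.9])
print(poincare_distance(root, leaf_a))    # moderate
print(poincare_distance(leaf_a, leaf_b))  # much larger
```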
BERT-MK: Integrating Graph Contextualized Knowledge into Pre-trained Language Models
Bin He | Di Zhou | Jinghui Xiao | Xin Jiang | Qun Liu | Nicholas Jing Yuan | Tong Xu
Findings of the Association for Computational Linguistics: EMNLP 2020
Complex node interactions are common in knowledge graphs (KGs), and these interactions can be considered contextualized knowledge that exists in the topological structure of KGs. Traditional knowledge representation learning (KRL) methods usually treat a single triple as the training unit, neglecting graph contextualized knowledge. To exploit this untapped graph-level knowledge, we propose an approach that models subgraphs in a medical KG. The learned knowledge is then integrated with a pre-trained language model for knowledge generalization. Experimental results demonstrate that our model achieves state-of-the-art performance on several medical NLP tasks, and the improvement over MedERNIE indicates that graph contextualized knowledge is beneficial.
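To make the triple-vs-subgraph distinction concrete, here is a minimal sketch contrasting the two training units, using hypothetical medical-KG entities; it is not the paper's actual implementation.

```python
from collections import defaultdict

# Toy medical KG as (head, relation, tail) triples (hypothetical examples).
triples = [
    ("aspirin", "treats", "headache"),
    ("aspirin", "interacts_with", "warfarin"),
    ("headache", "symptom_of", "migraine"),
]

# Triple-level training unit: each triple considered in isolation.
triple_units = list(triples)

# Graph-contextualized training unit: an entity together with every
# triple incident to it, i.e. its 1-hop subgraph.
adjacency = defaultdict(list)
for h, r, t in triples:
    adjacency[h].append((h, r, t))
    adjacency[t].append((h, r, t))

subgraph_unit = adjacency["aspirin"]
print(subgraph_unit)
# [('aspirin', 'treats', 'headache'), ('aspirin', 'interacts_with', 'warfarin')]
```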
2018
Neural Relation Classification with Text Descriptions
Feiliang Ren | Di Zhou | Zhihui Liu | Yongcheng Li | Rongsheng Zhao | Yongkang Liu | Xiaobo Liang
Proceedings of the 27th International Conference on Computational Linguistics
Relation classification is an important task in natural language processing. State-of-the-art methods usually concentrate on building deep neural network based classification models on training data in which the relations of the labeled entity pairs are given. However, these methods often suffer greatly from data sparsity. On the other hand, we notice that it is very easy to obtain concise text descriptions for almost all of the entities in a relation classification task. These text descriptions can provide helpful supplementary information for relation classification, but they are ignored by most existing methods. In this paper, we propose DesRC, a new neural relation classification method which integrates entities’ text descriptions into deep neural network models. We design a two-level attention mechanism to select the most useful information from the “intra-sentence” aspect and the “cross-sentence” aspect. Besides, adversarial training is used to further improve classification performance. Finally, we evaluate the proposed method on the SemEval 2010 dataset. Extensive experiments show that our method achieves much better results than other state-of-the-art relation classification methods.
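The two-level ("intra-sentence" and "cross-sentence") attention can be pictured as the same attention primitive applied twice: once over tokens within a description sentence, then over sentence vectors of the description. The abstract does not specify the scoring function, so the dot-product form below is an assumption, and all tensors are random placeholders.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def attend(query, keys, values):
    """Basic dot-product attention: weight values by query-key similarity."""
    scores = keys @ query        # (n,)
    weights = softmax(scores)    # (n,)
    return weights @ values      # weighted sum, shape (d,)

rng = np.random.default_rng(0)
d = 8
query = rng.normal(size=d)  # e.g. a representation of the entity pair

# "Intra-sentence" level: attend over token vectors within one description sentence.
tokens = rng.normal(size=(5, d))
sentence_vec = attend(query, tokens, tokens)

# "Cross-sentence" level: attend over the per-sentence vectors of a description.
sentences = np.stack([sentence_vec, rng.normal(size=d), rng.normal(size=d)])
description_vec = attend(query, sentences, sentences)
print(description_vec.shape)  # (8,)
```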