Ruoxu Wang
2020
Logic-guided Semantic Representation Learning for Zero-Shot Relation Classification
Juan Li | Ruoxu Wang | Ningyu Zhang | Wen Zhang | Fan Yang | Huajun Chen
Proceedings of the 28th International Conference on Computational Linguistics
Relation classification aims to extract semantic relations between entity pairs from sentences. However, most existing methods can only identify relation classes seen during training. To recognize unseen relations at test time, we explore the problem of zero-shot relation classification. Previous work casts the problem as reading comprehension or textual entailment and therefore has to rely on hand-crafted descriptive information to make relation types understandable; the rich semantic knowledge carried by the relation labels themselves is thus ignored. In this paper, we propose a novel logic-guided semantic representation learning model for zero-shot relation classification. Our approach builds connections between seen and unseen relations via implicit and explicit semantic representations, using knowledge graph embeddings and logic rules. Extensive experiments demonstrate that our method generalizes to unseen relation types and achieves promising improvements.
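The abstract describes scoring sentences against relation-label representations in a shared semantic space, so unseen relations can be scored the same way as seen ones. Below is a minimal sketch of that idea, assuming a pretrained sentence encoder and a fixed matrix of relation embeddings (in the paper these would come from KG embeddings refined with logic rules); the class and attribute names here are hypothetical, not the authors' code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ZeroShotRelationScorer(nn.Module):
    """Sketch: score sentence encodings against relation-label embeddings.

    Seen and unseen relations live in one embedding space, which is what
    lets the model assign scores to relation types never seen in training.
    """

    def __init__(self, sent_dim: int, rel_embeddings: torch.Tensor):
        super().__init__()
        # Project sentence encodings into the relation-embedding space.
        self.proj = nn.Linear(sent_dim, rel_embeddings.size(1))
        # Fixed relation embeddings (e.g., from a pretrained KG model).
        self.register_buffer("rel_emb", rel_embeddings)

    def forward(self, sent_enc: torch.Tensor) -> torch.Tensor:
        # Cosine similarity between each sentence and every relation label.
        s = F.normalize(self.proj(sent_enc), dim=-1)
        r = F.normalize(self.rel_emb, dim=-1)
        return s @ r.t()  # shape: (batch, num_relations)
```

At test time the score matrix can simply be restricted to the rows of unseen relations before taking the argmax, which is what makes the formulation zero-shot.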
2018
Label-Free Distant Supervision for Relation Extraction via Knowledge Graph Embedding
Guanying Wang | Wen Zhang | Ruoxu Wang | Yalin Zhou | Xi Chen | Wei Zhang | Hai Zhu | Huajun Chen
Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing
Distant supervision is an effective method for generating large-scale labeled data for relation extraction. It assumes that if a pair of entities participates in some relation of a Knowledge Graph (KG), every sentence containing those entities in a large unlabeled corpus can be labeled with that relation to train a relation classifier. However, when the pair of entities holds multiple relationships in the KG, this assumption produces noisy relation labels. This paper proposes a label-free distant supervision method that makes no use of relation labels obtained under this inadequate assumption, and instead uses only prior knowledge derived from the KG to supervise the learning of the classifier directly and softly. Specifically, we exploit type information and the translation law (h + r ≈ t) of a typical KG embedding model to learn embeddings for certain sentence patterns. Because the supervision signal is determined solely by the two aligned entities, neither hard relation labels nor an extra noise-reduction model over bags of sentences is needed. Experiments show that the approach performs well on a standard distant supervision dataset.
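The "translation law" referenced here is the TransE relation h + r ≈ t, so the relation implied by an aligned entity pair is approximately t − h. Below is a minimal sketch of a label-free training signal built on that law, assuming pretrained TransE entity embeddings and a sentence-pattern encoder; the function name translation_loss is hypothetical and this is not the authors' exact objective.

```python
import torch

def translation_loss(sent_emb: torch.Tensor,
                     head_emb: torch.Tensor,
                     tail_emb: torch.Tensor) -> torch.Tensor:
    """Pull the sentence-pattern embedding toward the relation vector
    implied by the entity pair under TransE (h + r ≈ t, hence r ≈ t − h).
    Only the two aligned entities supervise the encoder; no relation
    label from the KG is ever consulted.
    """
    target = tail_emb - head_emb                       # r ≈ t − h
    return (sent_emb - target).norm(p=2, dim=-1).mean()
```

Because the target is computed from the entity pair alone, this kind of objective sidesteps both hard relation labels and the bag-level noise-reduction models that label-based distant supervision typically requires.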
Co-authors
- Wen Zhang 2
- Huajun Chen 2
- Juan Li 1
- Ningyu Zhang 1
- Fan Yang 1
- Guanying Wang 1
- Yalin Zhou 1
- Xi Chen 1
- Wei Zhang 1
- Hai Zhu 1