Distant Supervision Relation Extraction with Intra-Bag and Inter-Bag Attentions

Zhi-Xiu Ye, Zhen-Hua Ling


Abstract
This paper presents a neural relation extraction method to deal with the noisy training data generated by distant supervision. Previous studies mainly focus on sentence-level de-noising by designing neural networks with intra-bag attentions. In this paper, both intra-bag and inter-bag attentions are considered in order to deal with noise at the sentence level and the bag level, respectively. First, relation-aware bag representations are calculated by weighting sentence embeddings using intra-bag attention, where each possible relation, rather than only the target relation as in conventional methods, serves as the attention query. Furthermore, the representation of a group of bags in the training set that share the same relation label is calculated by weighting bag representations using a similarity-based inter-bag attention module. Finally, a bag group is utilized as a training sample when building our relation extractor. Experimental results on the New York Times dataset demonstrate the effectiveness of the proposed intra-bag and inter-bag attention modules. Our method also achieves better relation extraction accuracy than state-of-the-art methods on this dataset.
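The two attention steps described in the abstract can be sketched as follows. This is a minimal NumPy illustration, not the paper's exact formulation: the function names are invented here, dot products stand in for the paper's learned scoring functions, and the similarity-sum weighting in the inter-bag step is a simplified assumption.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D array."""
    e = np.exp(x - x.max())
    return e / e.sum()

def intra_bag_attention(sentences, relation_query):
    """Sentence-level de-noising: weight sentence embeddings in a bag
    by their affinity to a relation query vector.
    sentences: (n_sent, dim), relation_query: (dim,) -> (dim,) bag repr."""
    scores = sentences @ relation_query   # one score per sentence
    alpha = softmax(scores)               # intra-bag attention weights
    return alpha @ sentences              # weighted sum of sentences

def inter_bag_attention(bag_reprs):
    """Bag-level de-noising: weight bags that share a relation label
    by their total similarity to the other bags in the group, so a
    mislabeled (dissimilar) bag receives a low weight.
    bag_reprs: (n_bag, dim) -> (dim,) group representation."""
    sim = bag_reprs @ bag_reprs.T         # pairwise bag similarities
    np.fill_diagonal(sim, 0.0)            # exclude self-similarity
    beta = softmax(sim.sum(axis=1))       # one weight per bag
    return beta @ bag_reprs               # weighted sum of bags
```

The resulting group representation would then be fed to the relation classifier, with each bag group serving as one training sample.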
Anthology ID:
N19-1288
Volume:
Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers)
Month:
June
Year:
2019
Address:
Minneapolis, Minnesota
Editors:
Jill Burstein, Christy Doran, Thamar Solorio
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
2810–2819
URL:
https://aclanthology.org/N19-1288
DOI:
10.18653/v1/N19-1288
Cite (ACL):
Zhi-Xiu Ye and Zhen-Hua Ling. 2019. Distant Supervision Relation Extraction with Intra-Bag and Inter-Bag Attentions. In Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pages 2810–2819, Minneapolis, Minnesota. Association for Computational Linguistics.
Cite (Informal):
Distant Supervision Relation Extraction with Intra-Bag and Inter-Bag Attentions (Ye & Ling, NAACL 2019)
PDF:
https://preview.aclanthology.org/ml4al-ingestion/N19-1288.pdf
Code:
ZhixiuYe/Intra-Bag-and-Inter-Bag-Attentions