Relation Extraction with Word Graphs from N-grams

Han Qin, Yuanhe Tian, Yan Song


Abstract
Most recent studies of relation extraction (RE) leverage the dependency tree of the input sentence to incorporate syntax-driven contextual information and improve model performance, with little attention paid to the limitation that high-quality dependency parsers are in most cases unavailable, especially for in-domain scenarios. To address this limitation, in this paper we propose attentive graph convolutional networks (A-GCN) to improve neural RE methods by building the context graph in an unsupervised manner, without relying on the existence of a dependency parser. Specifically, we construct the graph from n-grams extracted from a lexicon built via pointwise mutual information (PMI) and apply attention over the graph, so that different word pairs from the contexts within and across n-grams are weighted in the model and facilitate RE accordingly. Experimental results with further analyses on two English benchmark datasets for RE demonstrate the effectiveness of our approach, which achieves state-of-the-art performance on both datasets.
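
The abstract outlines two unsupervised steps: scoring n-grams by pointwise mutual information (PMI) to build a lexicon, and linking words that fall inside matched n-grams to form the context graph that A-GCN then weights with attention. The sketch below illustrates those two steps under simplifying assumptions: it restricts the lexicon to bigrams, omits the learned attention weights, and the function names (pmi_bigram_lexicon, word_graph) and the threshold parameter are illustrative, not taken from the paper's implementation.

import math
from collections import Counter

def pmi_bigram_lexicon(sentences, threshold=0.0):
    # Build a bigram lexicon via pointwise mutual information:
    #   PMI(w1, w2) = log( p(w1, w2) / (p(w1) * p(w2)) )
    # Bigrams scoring above `threshold` become lexicon entries.
    unigrams, bigrams = Counter(), Counter()
    for tokens in sentences:
        unigrams.update(tokens)
        bigrams.update(zip(tokens, tokens[1:]))
    n_uni = sum(unigrams.values())
    n_bi = sum(bigrams.values())
    lexicon = {}
    for (w1, w2), count in bigrams.items():
        pmi = math.log((count / n_bi) /
                       ((unigrams[w1] / n_uni) * (unigrams[w2] / n_uni)))
        if pmi > threshold:
            lexicon[(w1, w2)] = pmi
    return lexicon

def word_graph(tokens, lexicon):
    # Symmetric adjacency matrix over the sentence: words covered by
    # the same matched lexicon n-gram are connected, so the graph
    # encodes contextual word pairs without a dependency parser.
    n = len(tokens)
    adj = [[0] * n for _ in range(n)]
    for i in range(n):
        adj[i][i] = 1  # self-loops, as is standard for GCN inputs
        if i + 1 < n and (tokens[i], tokens[i + 1]) in lexicon:
            adj[i][i + 1] = adj[i + 1][i] = 1
    return adj

# Toy usage: lexicon from a two-sentence corpus, graph for one sentence.
corpus = [["deep", "learning", "improves", "relation", "extraction"],
          ["deep", "learning", "models", "need", "data"]]
lexicon = pmi_bigram_lexicon(corpus)
adjacency = word_graph(corpus[0], lexicon)

In the full model, each edge of such a graph would additionally receive an attention weight learned jointly with the GCN, which is what distinguishes A-GCN from a plain GCN over the same graph.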
Anthology ID:
2021.emnlp-main.228
Volume:
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2021
Address:
Online and Punta Cana, Dominican Republic
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
2860–2868
URL:
https://aclanthology.org/2021.emnlp-main.228
DOI:
10.18653/v1/2021.emnlp-main.228
Cite (ACL):
Han Qin, Yuanhe Tian, and Yan Song. 2021. Relation Extraction with Word Graphs from N-grams. In Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, pages 2860–2868, Online and Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
Relation Extraction with Word Graphs from N-grams (Qin et al., EMNLP 2021)
PDF:
https://preview.aclanthology.org/update-css-js/2021.emnlp-main.228.pdf
Data
SemEval-2010 Task 8