Effective Attention Modeling for Neural Relation Extraction

Tapas Nayak, Hwee Tou Ng


Abstract
Relation extraction is the task of determining the relation between two entities in a sentence. Distantly-supervised models are popular for this task. However, sentences can be long, and the two entities can be located far apart in a sentence. The evidence supporting the presence of a relation between two entities may not be very direct, since the entities may be connected via indirect links such as a third entity or via co-reference. Relation extraction in such scenarios becomes more challenging, as we need to capture the long-distance interactions among the entities and other words in the sentence. Also, the words in a sentence do not contribute equally to identifying the relation between the two entities. To address these issues, we propose a novel and effective attention model which incorporates syntactic information of the sentence and a multi-factor attention mechanism. Experiments on the New York Times corpus show that our proposed model outperforms prior state-of-the-art models.
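The multi-factor attention mentioned in the abstract can be sketched as follows. This is a hedged illustration, not the paper's exact formulation: each "factor" is assumed to be a separate bilinear scoring matrix producing its own attention distribution over the token representations, with the resulting context vectors concatenated. All names (`multi_factor_attention`, `factors`, the query vector `q`) are hypothetical.

```python
import numpy as np

def softmax(x):
    # numerically stable softmax over a 1-D score vector
    e = np.exp(x - x.max())
    return e / e.sum()

def multi_factor_attention(H, q, factors):
    """Sketch of multi-factor attention (assumed formulation):
    each factor W scores tokens bilinearly against a query vector q,
    yielding its own attention distribution over the tokens; the
    per-factor context vectors are concatenated."""
    contexts = []
    for W in factors:               # one scoring matrix per factor
        scores = H @ W @ q          # (n,) bilinear attention scores
        alpha = softmax(scores)     # attention weights, sum to 1
        contexts.append(alpha @ H)  # weighted sum of token vectors
    return np.concatenate(contexts)

# toy usage with random vectors
rng = np.random.default_rng(0)
n, d, m = 6, 4, 3                        # tokens, hidden size, factors
H = rng.standard_normal((n, d))          # token representations
q = rng.standard_normal(d)               # entity-pair query (hypothetical)
factors = [rng.standard_normal((d, d)) for _ in range(m)]
c = multi_factor_attention(H, q, factors)
print(c.shape)  # (m * d,) = (12,)
```

Multiple factors let the model attend to different evidence in the same sentence (e.g. a connecting third entity vs. the syntactic path), which a single attention distribution would have to average over.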
Anthology ID:
K19-1056
Volume:
Proceedings of the 23rd Conference on Computational Natural Language Learning (CoNLL)
Month:
November
Year:
2019
Address:
Hong Kong, China
Venue:
CoNLL
SIG:
SIGNLL
Publisher:
Association for Computational Linguistics
Pages:
603–612
URL:
https://aclanthology.org/K19-1056
DOI:
10.18653/v1/K19-1056
Cite (ACL):
Tapas Nayak and Hwee Tou Ng. 2019. Effective Attention Modeling for Neural Relation Extraction. In Proceedings of the 23rd Conference on Computational Natural Language Learning (CoNLL), pages 603–612, Hong Kong, China. Association for Computational Linguistics.
Cite (Informal):
Effective Attention Modeling for Neural Relation Extraction (Nayak & Ng, CoNLL 2019)
PDF:
https://preview.aclanthology.org/ingestion-script-update/K19-1056.pdf
Code:
nusnlp/MFA4RE