How Fragile is Relation Extraction under Entity Replacements?

Yiwei Wang, Bryan Hooi, Fei Wang, Yujun Cai, Yuxuan Liang, Wenxuan Zhou, Jing Tang, Manjuan Duan, Muhao Chen


Abstract
Relation extraction (RE) aims to extract the relations between entity names from their textual context. In principle, the textual context determines the ground-truth relation, and RE models should be able to correctly identify the relations that the context expresses. However, existing work has found that RE models memorize entity name patterns to make predictions while ignoring the textual context. This motivates us to ask: are RE models robust to entity replacements? In this work, we apply random and type-constrained entity replacements to the RE instances in TACRED and evaluate state-of-the-art RE models under these replacements. We observe F1 score drops of 30% to 50% on the state-of-the-art RE models under entity replacements. These results suggest that more effort is needed to develop RE models that are robust to entity replacements. We release the source code at https://github.com/wangywUST/RobustRE.
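To illustrate the kind of perturbation the abstract describes, the sketch below shows one way to perform a type-constrained (or fully random) entity replacement on a TACRED-style instance. This is a minimal illustration, not the paper's actual implementation: the name pool, the function name, and the field names (`token`, `subj_start`, `subj_end`, `obj_start`, `obj_end`, `subj_type`, `obj_type`, which follow the public TACRED JSON format) are assumptions for the example.

```python
import random

# Hypothetical replacement name pools keyed by TACRED entity type;
# the paper's actual replacement sources may differ.
NAME_POOL = {
    "PERSON": ["Alice Smith", "Rahul Mehta", "Chen Wei"],
    "ORGANIZATION": ["Acme Corp", "Globex Inc", "Initech"],
}

def replace_entities(instance, type_constrained=True, rng=random):
    """Return a copy of a TACRED-style instance with the subject and
    object mentions swapped for other entity names.

    With type_constrained=True, replacements are drawn from the pool of
    the same entity type; otherwise they are drawn from all pools
    (the "random" replacement setting). Span indices are 0-indexed and
    inclusive, as in the TACRED JSON format.
    """
    tokens = list(instance["token"])

    def sample_name(ent_type):
        if type_constrained and ent_type in NAME_POOL:
            pool = NAME_POOL[ent_type]
        else:
            pool = [name for names in NAME_POOL.values() for name in names]
        return rng.choice(pool).split()

    # Replace the later span first so the earlier span's offsets stay valid.
    spans = sorted(
        [(instance["subj_start"], instance["subj_end"], instance["subj_type"]),
         (instance["obj_start"], instance["obj_end"], instance["obj_type"])],
        reverse=True,
    )
    for start, end, ent_type in spans:
        tokens[start:end + 1] = sample_name(ent_type)

    # Note: a full implementation would also recompute the stored span
    # indices, since the replacement names may differ in token length.
    out = dict(instance)
    out["token"] = tokens
    return out
```

A replaced instance keeps its original relation label, since the textual context is unchanged; the evaluation then measures how much the model's predictions degrade when only the entity names differ.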
Anthology ID:
2023.conll-1.27
Volume:
Proceedings of the 27th Conference on Computational Natural Language Learning (CoNLL)
Month:
December
Year:
2023
Address:
Singapore
Editors:
Jing Jiang, David Reitter, Shumin Deng
Venue:
CoNLL
Publisher:
Association for Computational Linguistics
Pages:
414–423
URL:
https://aclanthology.org/2023.conll-1.27
DOI:
10.18653/v1/2023.conll-1.27
Cite (ACL):
Yiwei Wang, Bryan Hooi, Fei Wang, Yujun Cai, Yuxuan Liang, Wenxuan Zhou, Jing Tang, Manjuan Duan, and Muhao Chen. 2023. How Fragile is Relation Extraction under Entity Replacements?. In Proceedings of the 27th Conference on Computational Natural Language Learning (CoNLL), pages 414–423, Singapore. Association for Computational Linguistics.
Cite (Informal):
How Fragile is Relation Extraction under Entity Replacements? (Wang et al., CoNLL 2023)
PDF:
https://preview.aclanthology.org/naacl24-info/2023.conll-1.27.pdf