Let’s Stop Incorrect Comparisons in End-to-end Relation Extraction!

Bruno Taillé, Vincent Guigue, Geoffrey Scoutheeten, Patrick Gallinari


Abstract
Despite efforts to distinguish three different evaluation setups (Bekoulis et al., 2018), numerous end-to-end Relation Extraction (RE) articles present unreliable performance comparisons to previous work. In this paper, we first identify several patterns of invalid comparisons in published papers and describe them to avoid their propagation. We then propose a small empirical study to quantify the impact of the most common mistake and show that it leads to overestimating the final RE performance by around 5% on ACE05. We also seize this opportunity to study the unexplored ablations of two recent developments: the use of language model pretraining (specifically BERT) and span-level NER. This meta-analysis emphasizes the need for rigor when reporting both the evaluation setting and the dataset statistics. We finally call for unifying the evaluation setting in end-to-end RE.
Anthology ID:
2020.emnlp-main.301
Volume:
Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP)
Month:
November
Year:
2020
Address:
Online
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
3689–3701
URL:
https://aclanthology.org/2020.emnlp-main.301
DOI:
10.18653/v1/2020.emnlp-main.301
Cite (ACL):
Bruno Taillé, Vincent Guigue, Geoffrey Scoutheeten, and Patrick Gallinari. 2020. Let’s Stop Incorrect Comparisons in End-to-end Relation Extraction!. In Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), pages 3689–3701, Online. Association for Computational Linguistics.
Cite (Informal):
Let’s Stop Incorrect Comparisons in End-to-end Relation Extraction! (Taillé et al., EMNLP 2020)
PDF:
https://preview.aclanthology.org/paclic-22-ingestion/2020.emnlp-main.301.pdf
Optional supplementary material:
 2020.emnlp-main.301.OptionalSupplementaryMaterial.zip
Video:
 https://slideslive.com/38938999
Code
btaille/sincere + additional community code
Data
SciERC