Evaluation of named entity coreference

Oshin Agarwal, Sanjay Subramanian, Ani Nenkova, Dan Roth


Abstract
In many NLP applications, such as search and information extraction for named entities, it is necessary to find all the mentions of a named entity, some of which appear as pronouns (she, his, etc.) or nominals (the professor, the German chancellor, etc.). It is therefore important that coreference resolution systems be able to link these different types of mentions to the correct entity name. We evaluate state-of-the-art coreference resolution systems on the task of resolving all mentions to named entities. Our analysis reveals that standard coreference metrics do not adequately reflect the requirements of this task: they do not penalize systems for failing to identify any named mentions of an entity, and they reward systems that correctly cluster mentions of the same entity even when the cluster is never linked to a proper name (she–the student–no name). We introduce new metrics for evaluating named entity coreference that address these discrepancies and show that, when comparing competitive systems, standard coreference evaluations can give misleading results for this task. We are, however, able to confirm that the state-of-the-art system according to traditional evaluations also performs vastly better than other systems on the named entity coreference task.
Anthology ID:
W19-2801
Volume:
Proceedings of the Second Workshop on Computational Models of Reference, Anaphora and Coreference
Month:
June
Year:
2019
Address:
Minneapolis, USA
Editors:
Maciej Ogrodniczuk, Sameer Pradhan, Yulia Grishina, Vincent Ng
Venue:
CRAC
Publisher:
Association for Computational Linguistics
Pages:
1–7
URL:
https://aclanthology.org/W19-2801
DOI:
10.18653/v1/W19-2801
Cite (ACL):
Oshin Agarwal, Sanjay Subramanian, Ani Nenkova, and Dan Roth. 2019. Evaluation of named entity coreference. In Proceedings of the Second Workshop on Computational Models of Reference, Anaphora and Coreference, pages 1–7, Minneapolis, USA. Association for Computational Linguistics.
Cite (Informal):
Evaluation of named entity coreference (Agarwal et al., CRAC 2019)
PDF:
https://preview.aclanthology.org/naacl24-info/W19-2801.pdf
Supplementary:
W19-2801.Supplementary.pdf