Entity Linking via Explicit Mention-Mention Coreference Modeling

Dhruv Agarwal, Rico Angell, Nicholas Monath, Andrew McCallum


Abstract
Learning representations of entity mentions is a core component of modern entity linking systems, used both for candidate generation and for making linking predictions. In this paper, we present and empirically analyze a novel training approach for learning mention and entity representations that is based on building minimum spanning arborescences (i.e., directed spanning trees) over mentions and entities across documents to explicitly model mention coreference relationships. We demonstrate the efficacy of our approach by showing significant improvements in both candidate generation recall and linking accuracy on the Zero-Shot Entity Linking (ZESHEL) dataset and on MedMentions, the largest publicly available biomedical entity linking dataset. In addition, we show that our improvements in candidate generation yield higher-quality re-ranking models downstream, setting a new state-of-the-art result in linking accuracy on MedMentions. Finally, we demonstrate that our improved mention representations are also effective for discovering new entities via cross-document coreference.
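
The following is a minimal sketch, not the authors' arboEL implementation, of the core structural idea in the abstract: building a minimum spanning arborescence over entity and mention nodes whose directed edges are weighted by embedding distance. The random toy embeddings, the Euclidean distance, the virtual root node, and the node names are all illustrative assumptions; it relies on networkx's implementation of Edmonds' algorithm.

```python
# Sketch only: directed spanning tree (arborescence) over entities and mentions,
# with edge weights from embedding distances. Not the paper's training procedure.
import numpy as np
import networkx as nx

rng = np.random.default_rng(0)
dim = 8
entity_embs = {f"e{i}": rng.normal(size=dim) for i in range(3)}   # toy entity vectors
mention_embs = {f"m{j}": rng.normal(size=dim) for j in range(6)}  # toy mention vectors

def dist(u, v):
    """Euclidean distance between two embedding vectors (illustrative choice)."""
    return float(np.linalg.norm(u - v))

G = nx.DiGraph()
ROOT = "ROOT"  # virtual root so a single spanning arborescence is guaranteed to exist

# Zero-cost edges from the virtual root to every entity: each entity becomes
# the root of its own subtree of linked/coreferent mentions.
for e in entity_embs:
    G.add_edge(ROOT, e, weight=0.0)

# Directed entity -> mention and mention -> mention edges, weighted by embedding
# distance, so the arborescence prefers close (likely coreferent) pairs.
for e, e_vec in entity_embs.items():
    for m, m_vec in mention_embs.items():
        G.add_edge(e, m, weight=dist(e_vec, m_vec))
for m1, v1 in mention_embs.items():
    for m2, v2 in mention_embs.items():
        if m1 != m2:
            G.add_edge(m1, m2, weight=dist(v1, v2))

# Chu-Liu/Edmonds: minimum-weight directed spanning tree rooted at ROOT.
arbo = nx.minimum_spanning_arborescence(G)

# Read off a linking decision: each mention belongs to the entity at the root
# of its subtree (the node whose parent is the virtual root).
for m in mention_embs:
    node, parent = m, next(iter(arbo.predecessors(m)))
    while parent != ROOT:
        node, parent = parent, next(iter(arbo.predecessors(parent)))
    print(f"{m} -> {node}")
```

Attaching every entity to a shared zero-cost virtual root is one simple way to make a spanning arborescence exist while keeping entities as subtree roots; the paper's actual objective, edge constraints, and inference differ and are described in the full text.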
Anthology ID:
2022.naacl-main.343
Volume:
Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies
Month:
July
Year:
2022
Address:
Seattle, United States
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
4644–4658
URL:
https://aclanthology.org/2022.naacl-main.343
DOI:
10.18653/v1/2022.naacl-main.343
Cite (ACL):
Dhruv Agarwal, Rico Angell, Nicholas Monath, and Andrew McCallum. 2022. Entity Linking via Explicit Mention-Mention Coreference Modeling. In Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, pages 4644–4658, Seattle, United States. Association for Computational Linguistics.
Cite (Informal):
Entity Linking via Explicit Mention-Mention Coreference Modeling (Agarwal et al., NAACL 2022)
PDF:
https://preview.aclanthology.org/ingestion-script-update/2022.naacl-main.343.pdf
Video:
https://preview.aclanthology.org/ingestion-script-update/2022.naacl-main.343.mp4
Code:
dhdhagar/arboEL
Data:
MedMentions, ZESHEL