Abstract
Recent work in entity disambiguation (ED) has typically neglected structured knowledge base (KB) facts, and instead relied on a limited subset of KB information, such as entity descriptions or types. This limits the range of contexts in which entities can be disambiguated. To allow the use of all KB facts, as well as descriptions and types, we introduce an ED model which links entities by reasoning over a symbolic knowledge base in a fully differentiable fashion. Our model surpasses state-of-the-art baselines on six well-established ED datasets by 1.3 F1 on average. By allowing access to all KB information, our model is less reliant on popularity-based entity priors, and improves performance on the challenging ShadowLink dataset (which emphasises infrequent and ambiguous entities) by 12.7 F1.
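To make the abstract's core idea concrete, here is a minimal toy sketch of differentiable reasoning over symbolic KB triples for entity disambiguation. This is not the authors' architecture (see the paper and linked code for that): all names, dimensions, the fact encoding, and the toy KB below are invented for illustration. The key point it demonstrates is that candidate scoring can soft-attend over discrete (head, relation, tail) facts, so gradients flow through the KB lookup end to end.

```python
# Hypothetical sketch (NOT the paper's implementation): score candidate
# entities for a mention by combining direct context similarity with a
# differentiable soft-attention score over symbolic KB triples.
import torch
import torch.nn.functional as F

torch.manual_seed(0)
NUM_ENTITIES, NUM_RELATIONS, DIM = 100, 10, 32

# Embedding tables standing in for trained entity/relation encoders.
entity_emb = torch.nn.Embedding(NUM_ENTITIES, DIM)
relation_emb = torch.nn.Embedding(NUM_RELATIONS, DIM)

# Toy symbolic KB: (head_entity, relation, tail_entity) id triples.
kb_triples = torch.tensor([
    [3, 1, 7],   # e.g. (Paris, capital_of, France)
    [3, 2, 9],   # e.g. (Paris, located_in, Europe)
    [5, 1, 7],
])

def fact_score(candidate_ids: torch.Tensor, context_vec: torch.Tensor) -> torch.Tensor:
    """Soft-attend over all KB facts, weighting each fact by how well its
    head entity matches the candidate, then measure how well the attended
    fact representation matches the mention context. Fully differentiable."""
    heads = entity_emb(kb_triples[:, 0])            # (F, D)
    rels = relation_emb(kb_triples[:, 1])           # (F, D)
    tails = entity_emb(kb_triples[:, 2])            # (F, D)
    facts = heads + rels + tails                    # (F, D) naive fact encoding

    cand = entity_emb(candidate_ids)                # (C, D)
    head_match = cand @ heads.t() / DIM ** 0.5      # (C, F) scaled similarity
    attn = F.softmax(head_match, dim=-1)            # (C, F) soft fact selection
    attended = attn @ facts                         # (C, D)
    return attended @ context_vec                   # (C,)

def disambiguate(candidate_ids: torch.Tensor, context_vec: torch.Tensor) -> torch.Tensor:
    """Combine context similarity with the KB-fact score and return a
    distribution over the mention's candidate entities."""
    context_score = entity_emb(candidate_ids) @ context_vec          # (C,)
    return F.softmax(context_score + fact_score(candidate_ids, context_vec), dim=-1)

context = torch.randn(DIM)              # stand-in for an encoded mention context
candidates = torch.tensor([3, 5, 42])   # candidate entity ids for the mention
print(disambiguate(candidates, context))
```

Because the fact lookup is a softmax-weighted mixture rather than a hard symbolic match, the whole scoring function is trainable with ordinary backpropagation; this is the general mechanism the abstract refers to, not the specific model.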
- Anthology ID: 2022.naacl-main.210
- Volume: Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies
- Month: July
- Year: 2022
- Address: Seattle, United States
- Editors: Marine Carpuat, Marie-Catherine de Marneffe, Ivan Vladimir Meza Ruiz
- Venue: NAACL
- Publisher: Association for Computational Linguistics
- Pages: 2899–2912
- URL: https://aclanthology.org/2022.naacl-main.210
- DOI: 10.18653/v1/2022.naacl-main.210
- Cite (ACL): Tom Ayoola, Joseph Fisher, and Andrea Pierleoni. 2022. Improving Entity Disambiguation by Reasoning over a Knowledge Base. In Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, pages 2899–2912, Seattle, United States. Association for Computational Linguistics.
- Cite (Informal): Improving Entity Disambiguation by Reasoning over a Knowledge Base (Ayoola et al., NAACL 2022)
- PDF: https://aclanthology.org/2022.naacl-main.210.pdf
- Code: additional community code
- Data: ACE 2004, AIDA CoNLL-YAGO, AQUAINT, CoNLL, DocRED