Entity Embedding Completion for Wide-Coverage Entity Disambiguation

Daisuke Oba, Ikuya Yamada, Naoki Yoshinaga, Masashi Toyoda


Abstract
Entity disambiguation (ED) is typically solved by learning to classify a given mention into one of the entities in the model’s entity vocabulary by referring to their embeddings. However, this approach cannot address mentions of entities that are not covered by the entity vocabulary. Aiming to enhance the applicability of ED models, we propose a method of extending a state-of-the-art ED model by dynamically computing embeddings of out-of-vocabulary entities. Specifically, our method computes embeddings from entity descriptions and mention contexts. Experiments with standard benchmark datasets show that the extended model performs comparably to or better than existing models whose entity embeddings are trained for all candidate entities, as well as embedding-free models. We release our source code and model checkpoints at https://github.com/studio-ousia/steel.
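To make the idea in the abstract concrete, below is a minimal, hypothetical sketch of the general technique it describes: encoding an entity's textual description with a pretrained encoder to obtain an embedding for an out-of-vocabulary entity, then scoring candidate entities against the encoded mention context. The choice of bert-base-uncased, [CLS] pooling, dot-product scoring, and all function names here are illustrative assumptions; this is not the authors' implementation (see the linked repository for that).

# Minimal sketch (NOT the authors' implementation): derive an embedding for an
# out-of-vocabulary entity from its textual description, then score candidates
# against an encoded mention context with a dot product.
import torch
from transformers import AutoModel, AutoTokenizer

MODEL_NAME = "bert-base-uncased"  # illustrative choice of encoder
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
encoder = AutoModel.from_pretrained(MODEL_NAME)
encoder.eval()


@torch.no_grad()
def encode_text(text: str) -> torch.Tensor:
    """Encode a text span into a single vector via [CLS] pooling (an assumption)."""
    inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=128)
    outputs = encoder(**inputs)
    return outputs.last_hidden_state[:, 0, :].squeeze(0)  # [CLS] token vector


def embed_oov_entity(description: str) -> torch.Tensor:
    """Compute an entity embedding on the fly from its description,
    standing in for a missing row in the entity-embedding table."""
    return encode_text(description)


def rank_candidates(mention_context: str, candidates: dict[str, str]) -> list[tuple[str, float]]:
    """Score each candidate entity (name -> description) against the mention context."""
    context_vec = encode_text(mention_context)
    scores = {
        name: float(torch.dot(context_vec, embed_oov_entity(desc)))
        for name, desc in candidates.items()
    }
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)


if __name__ == "__main__":
    context = "The spacecraft entered orbit around [Jupiter] in 2016."
    candidates = {
        "Jupiter (planet)": "Jupiter is the fifth planet from the Sun and the largest in the Solar System.",
        "Jupiter (mythology)": "Jupiter is the god of the sky and thunder in ancient Roman religion.",
    }
    for name, score in rank_candidates(context, candidates):
        print(f"{score:8.2f}  {name}")

In the paper's setting, such a dynamically computed embedding fills the slot that a trained entity embedding would normally occupy, so the classifier over in-vocabulary entities extends to unseen ones.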
Anthology ID:
2022.findings-emnlp.472
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2022
Month:
December
Year:
2022
Address:
Abu Dhabi, United Arab Emirates
Editors:
Yoav Goldberg, Zornitsa Kozareva, Yue Zhang
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
6333–6344
URL:
https://aclanthology.org/2022.findings-emnlp.472
DOI:
10.18653/v1/2022.findings-emnlp.472
Cite (ACL):
Daisuke Oba, Ikuya Yamada, Naoki Yoshinaga, and Masashi Toyoda. 2022. Entity Embedding Completion for Wide-Coverage Entity Disambiguation. In Findings of the Association for Computational Linguistics: EMNLP 2022, pages 6333–6344, Abu Dhabi, United Arab Emirates. Association for Computational Linguistics.
Cite (Informal):
Entity Embedding Completion for Wide-Coverage Entity Disambiguation (Oba et al., Findings 2022)
PDF:
https://aclanthology.org/2022.findings-emnlp.472.pdf