Neural Collective Entity Linking

Yixin Cao, Lei Hou, Juanzi Li, Zhiyuan Liu


Abstract
Entity Linking aims to link entity mentions in texts to knowledge bases, and neural models have achieved recent success in this task. However, most existing methods rely on local contexts to resolve entities independently, which often fails due to the data sparsity of local information. To address this issue, we propose a novel neural model for collective entity linking, named NCEL. NCEL applies Graph Convolutional Networks to integrate both local contextual features and global coherence information for entity linking. To improve computational efficiency, we approximately perform graph convolution on a subgraph of adjacent entity mentions instead of those in the entire text. We further introduce an attention scheme to improve the robustness of NCEL to data noise, and train the model on Wikipedia hyperlinks to avoid overfitting and domain bias. In experiments, we evaluate NCEL on five publicly available datasets to verify its linking performance as well as its generalization ability. We also conduct an extensive analysis of time complexity, the impact of key modules, and qualitative results, which demonstrates the effectiveness and efficiency of our proposed method.
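The graph convolution over a subgraph of adjacent candidate entities described in the abstract can be sketched as follows. This is a minimal, hypothetical illustration of a standard GCN layer with symmetric normalization, not the authors' implementation; the shapes, names, and toy adjacency are assumptions.

```python
import numpy as np

def gcn_layer(A, H, W):
    """One graph-convolution layer over a candidate subgraph.

    A: (n, n) adjacency among candidate entities of adjacent mentions (toy example)
    H: (n, d) node features, e.g. mixing local context and coherence signals
    W: (d, k) learned weight matrix
    """
    A_hat = A + np.eye(A.shape[0])                  # add self-loops
    d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(axis=1))
    A_norm = A_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]  # D^-1/2 A D^-1/2
    return np.maximum(0.0, A_norm @ H @ W)          # ReLU activation

# Toy subgraph: 3 candidate entities, the first linked to the other two.
A = np.array([[0, 1, 1], [1, 0, 0], [1, 0, 0]], dtype=float)
H = np.random.randn(3, 4)
W = np.random.randn(4, 2)
out = gcn_layer(A, H, W)
print(out.shape)  # (3, 2)
```

Restricting the adjacency matrix to candidates of nearby mentions (rather than all mentions in the document) is what keeps the per-layer cost small, as the abstract's complexity analysis suggests.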
Anthology ID:
C18-1057
Volume:
Proceedings of the 27th International Conference on Computational Linguistics
Month:
August
Year:
2018
Address:
Santa Fe, New Mexico, USA
Editors:
Emily M. Bender, Leon Derczynski, Pierre Isabelle
Venue:
COLING
Publisher:
Association for Computational Linguistics
Pages:
675–686
URL:
https://aclanthology.org/C18-1057
Cite (ACL):
Yixin Cao, Lei Hou, Juanzi Li, and Zhiyuan Liu. 2018. Neural Collective Entity Linking. In Proceedings of the 27th International Conference on Computational Linguistics, pages 675–686, Santa Fe, New Mexico, USA. Association for Computational Linguistics.
Cite (Informal):
Neural Collective Entity Linking (Cao et al., COLING 2018)
PDF:
https://preview.aclanthology.org/ml4al-ingestion/C18-1057.pdf
Code
 TaoMiner/NCEL