Mohamed H. Gad-Elrab
2022
A Study on Entity Linking Across Domains: Which Data is Best for Fine-Tuning?
Hassan Soliman | Heike Adel | Mohamed H. Gad-Elrab | Dragan Milchevski | Jannik Strötgen
Proceedings of the 7th Workshop on Representation Learning for NLP
Entity linking disambiguates mentions by mapping them to entities in a knowledge graph (KG). An important open question is how to extend neural entity linking systems to new domains. In this paper, we aim for a system that links mentions to entities from both a general-domain KG and a domain-specific KG at the same time. In particular, we represent the entities of the different KGs in a joint vector space and investigate which data is best suited for creating and fine-tuning that space, and whether fine-tuning harms performance on the general domain. We find that a combination of data from both the general and the specific domain is most helpful; the former is especially necessary for avoiding performance loss on the general domain. While additional supervision on entities that appear in both KGs performs best in an intrinsic evaluation of the vector space, it has less impact on the downstream task of entity linking.
2015
EDRAK: Entity-Centric Data Resource for Arabic Knowledge
Mohamed H. Gad-Elrab | Mohamed Amir Yosef | Gerhard Weikum
Proceedings of the Second Workshop on Arabic Natural Language Processing
Co-authors
- Hassan Soliman 1
- Heike Adel 1
- Dragan Milchevski 1
- Jannik Strötgen 1
- Mohamed Amir Yosef 1