Leveraging the Power of Large Language Models in Entity Linking via Adaptive Routing and Targeted Reasoning
Yajie Li | Albert Galimov | Mitra Datta Ganapaneni | Pujitha Thejaswi | De Meng | Priyanshu Kumar | Saloni Potdar
Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing: Industry Track
Entity Linking (EL) has traditionally relied on large annotated datasets and extensive model fine-tuning. While recent few-shot methods leverage large language models (LLMs) through prompting to reduce training requirements, they often suffer from inefficiencies due to expensive LLM-based reasoning. ARTER (Adaptive Routing and Targeted Entity Reasoning) presents a structured pipeline that achieves high performance without deep fine-tuning by strategically combining candidate generation, context-based scoring, adaptive routing, and selective reasoning. ARTER computes a small set of complementary signals (both embedding- and LLM-based) over the retrieved candidates to categorize contextual mentions into easy and hard cases. These cases are then handled by a computationally cheap entity linker (e.g., ReFinED) and by more expensive targeted LLM-based reasoning, respectively. On standard benchmarks, ARTER outperforms ReFinED by up to +4.47%, with an average gain of +2.53% on 5 out of 6 datasets, and performs comparably to pipelines that use LLM-based reasoning for all mentions while being twice as efficient in terms of the number of LLM tokens.
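The routing idea from the abstract can be pictured as follows. This is a minimal sketch under stated assumptions, not the paper's implementation: the `Candidate` type, the single margin-based signal, the threshold value, and the `cheap_linker`/`llm_reasoner` interfaces are all hypothetical illustrations of the easy/hard split.

```python
# Hypothetical sketch of adaptive routing between a cheap entity linker and
# targeted LLM-based reasoning; names and signals are illustrative only.
from dataclasses import dataclass
from typing import Callable, List


@dataclass
class Candidate:
    entity_id: str
    score: float  # context-based score for this candidate (e.g. embedding similarity)


def route(mention: str,
          context: str,
          candidates: List[Candidate],
          cheap_linker: Callable[[str, str, List[Candidate]], str],
          llm_reasoner: Callable[[str, str, List[Candidate]], str],
          margin_threshold: float = 0.15) -> str:
    """Send confidently scored mentions to the cheap linker; escalate
    ambiguous ones to more expensive targeted LLM-based reasoning."""
    ranked = sorted(candidates, key=lambda c: c.score, reverse=True)
    # One possible "easy vs. hard" signal: the margin between the top two
    # candidate scores. A large margin suggests an unambiguous mention.
    margin = ranked[0].score - (ranked[1].score if len(ranked) > 1 else 0.0)
    if margin >= margin_threshold:
        return cheap_linker(mention, context, ranked)      # easy case
    return llm_reasoner(mention, context, ranked)          # hard case
```

In this sketch only low-margin mentions incur LLM tokens, which is the mechanism behind the efficiency gain the abstract reports; the paper combines several such signals rather than a single margin.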