It’s All About the Confidence: An Unsupervised Approach for Multilingual Historical Entity Linking using Large Language Models

Cristian Santini, Marieke van Erp, Mehwish Alam


Abstract
Despite the recent advancements in NLP with the advent of Large Language Models (LLMs), Entity Linking (EL) for historical texts remains challenging due to linguistic variation, noisy inputs, and evolving semantic conventions. Existing solutions either require substantial training data or rely on domain-specific rules that limit scalability. In this paper, we present MHEL-LLaMo (Multilingual Historical Entity Linking with Large Language MOdels), an unsupervised ensemble approach combining a Small Language Model (SLM) and an LLM. MHEL-LLaMo leverages a multilingual bi-encoder (BELA) for candidate retrieval and an instruction-tuned LLM for NIL prediction and candidate selection via prompt chaining. Our system uses the SLM’s confidence scores to discriminate between easy and hard samples, applying the LLM only to hard cases. This strategy reduces computational costs while preventing hallucinations on straightforward cases. We evaluate MHEL-LLaMo on four established benchmarks in six European languages (English, Finnish, French, German, Italian, and Swedish) from the 19th and 20th centuries. Results demonstrate that MHEL-LLaMo outperforms state-of-the-art models without requiring fine-tuning, offering a scalable solution for low-resource historical EL. Our error analysis reveals that 41% of false predictions exhibit semantic proximity to ground-truth entities, highlighting the LLM’s accurate disambiguation of historical references.
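The confidence-gated ensemble described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the `DummySLM`/`DummyLLM` interfaces, the candidate scores, and the 0.8 threshold are all assumptions standing in for the BELA bi-encoder and the instruction-tuned LLM prompt chain.

```python
# Illustrative sketch of confidence-gated entity linking.
# All class names, scores, and the threshold are hypothetical.

THRESHOLD = 0.8  # assumed cut-off separating easy from hard mentions


class DummySLM:
    """Stand-in for the BELA bi-encoder: returns scored KB candidates."""
    def retrieve(self, mention, context):
        # A real system would embed mention + context and search a KB index.
        scores = {"Paris": [("Q90", 0.95), ("Q167646", 0.60)],
                  "Troy": [("Q22647", 0.40), ("Q601033", 0.35)]}
        return scores.get(mention, [])


class DummyLLM:
    """Stand-in for the instruction-tuned LLM prompt chain."""
    def predict_nil(self, mention, context, candidates):
        # Prompt 1 of the chain: is the mention linkable at all?
        return not candidates
    def select(self, mention, context, candidates):
        # Prompt 2 of the chain: choose among the retrieved candidates.
        return candidates[0]


def link(mention, context, slm, llm, threshold=THRESHOLD):
    candidates = slm.retrieve(mention, context)
    if not candidates:
        return None  # NIL: nothing retrieved
    best, confidence = max(candidates, key=lambda c: c[1])
    # Easy case: the SLM is confident enough, so the LLM is never called.
    if confidence >= threshold:
        return best
    # Hard case: fall back to the LLM (NIL prediction, then selection).
    ids = [c for c, _ in candidates]
    if llm.predict_nil(mention, context, ids):
        return None
    return llm.select(mention, context, ids)


print(link("Paris", "ctx", DummySLM(), DummyLLM()))  # easy case → "Q90"
print(link("Troy", "ctx", DummySLM(), DummyLLM()))   # hard case → "Q22647"
```

The design point is that the LLM's cost (and its hallucination risk) is paid only on the low-confidence mentions; high-confidence retrievals bypass it entirely.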
Anthology ID:
2026.eacl-long.184
Volume:
Proceedings of the 19th Conference of the European Chapter of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
March
Year:
2026
Address:
Rabat, Morocco
Editors:
Vera Demberg, Kentaro Inui, Lluís Màrquez
Venue:
EACL
Publisher:
Association for Computational Linguistics
Pages:
3939–3954
URL:
https://preview.aclanthology.org/ingest-eacl/2026.eacl-long.184/
Cite (ACL):
Cristian Santini, Marieke van Erp, and Mehwish Alam. 2026. It’s All About the Confidence: An Unsupervised Approach for Multilingual Historical Entity Linking using Large Language Models. In Proceedings of the 19th Conference of the European Chapter of the Association for Computational Linguistics (Volume 1: Long Papers), pages 3939–3954, Rabat, Morocco. Association for Computational Linguistics.
Cite (Informal):
It’s All About the Confidence: An Unsupervised Approach for Multilingual Historical Entity Linking using Large Language Models (Santini et al., EACL 2026)
PDF:
https://preview.aclanthology.org/ingest-eacl/2026.eacl-long.184.pdf