Rethinking Hallucinations: Correctness, Consistency, and Prompt Multiplicity

Prakhar Ganesh, Reza Shokri, Golnoosh Farnadi


Abstract
Large language models (LLMs) are known to "hallucinate" by generating false or misleading outputs. Hallucinations pose a range of harms, from the erosion of trust to widespread misinformation. Existing hallucination evaluation, however, focuses only on correctness and often overlooks consistency, which is necessary to distinguish and address these harms. To bridge this gap, we introduce prompt multiplicity, a framework for quantifying consistency in LLM evaluations. Our analysis reveals significant multiplicity (over 50% inconsistency on benchmarks such as Med-HALT), suggesting that hallucination-related harms have been severely misunderstood. Furthermore, we study the role of consistency in hallucination detection and mitigation. We find that: (a) detection techniques capture consistency, not correctness, and (b) mitigation techniques such as RAG, while beneficial, can introduce additional inconsistencies. By integrating prompt multiplicity into hallucination evaluation, we provide an improved framing of potential harms and uncover critical limitations in current detection and mitigation strategies.
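To make the idea of quantifying consistency concrete: the paper's exact metric is not reproduced here, but a minimal sketch of one plausible reading, in which the same question is posed under several paraphrased prompts and disagreement across paraphrases is counted, might look as follows. The function name multiplicity_report, the template format, and the majority-agreement statistic are all illustrative assumptions, not the authors' definitions.

# Hypothetical sketch (not the paper's method): query a model with several
# paraphrases of each question and report how often the answers disagree.
from collections import Counter
from typing import Callable, Sequence

def multiplicity_report(
    questions: Sequence[str],
    templates: Sequence[str],      # assumed paraphrase templates, e.g. "Q: {q}\nA:"
    answer: Callable[[str], str],  # assumed wrapper around an LLM call that
                                   # returns a normalized short answer
) -> dict:
    """Fraction of questions whose answer changes with the prompt, plus the
    mean per-question majority agreement across prompt variants."""
    inconsistent = 0
    agreement = 0.0
    for q in questions:
        answers = [answer(t.format(q=q)) for t in templates]
        counts = Counter(answers)
        agreement += counts.most_common(1)[0][1] / len(answers)
        if len(counts) > 1:  # the answer depends on the phrasing alone
            inconsistent += 1
    n = len(questions)
    return {
        "inconsistency_rate": inconsistent / n,
        "mean_majority_agreement": agreement / n,
    }

Under this reading, an inconsistency_rate above 0.5 would roughly correspond to the >50% figure the abstract reports for Med-HALT: more than half of the questions receive different answers depending only on how they are phrased.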
Anthology ID:
2026.eacl-long.327
Volume:
Proceedings of the 19th Conference of the European Chapter of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
March
Year:
2026
Address:
Rabat, Morocco
Editors:
Vera Demberg, Kentaro Inui, Lluís Màrquez
Venue:
EACL
Publisher:
Association for Computational Linguistics
Pages:
6959–6978
URL:
https://preview.aclanthology.org/ingest-eacl/2026.eacl-long.327/
Cite (ACL):
Prakhar Ganesh, Reza Shokri, and Golnoosh Farnadi. 2026. Rethinking Hallucinations: Correctness, Consistency, and Prompt Multiplicity. In Proceedings of the 19th Conference of the European Chapter of the Association for Computational Linguistics (Volume 1: Long Papers), pages 6959–6978, Rabat, Morocco. Association for Computational Linguistics.
Cite (Informal):
Rethinking Hallucinations: Correctness, Consistency, and Prompt Multiplicity (Ganesh et al., EACL 2026)
PDF:
https://preview.aclanthology.org/ingest-eacl/2026.eacl-long.327.pdf