Can LLMs Ground when they (Don’t) Know: A Study on Direct and Loaded Political Questions

Clara Lachenmaier, Judith Sieker, Sina Zarrieß


Abstract
Communication among humans relies on conversational grounding, allowing interlocutors to reach mutual understanding even when they do not have perfect knowledge and must resolve discrepancies in each other’s beliefs. This paper investigates how large language models (LLMs) manage common ground in cases where they (don’t) possess knowledge, focusing on facts in the political domain where the risk of misinformation and grounding failure is high. We examine LLMs’ ability to answer direct knowledge questions and loaded questions that presuppose misinformation. We evaluate whether loaded questions lead LLMs to engage in active grounding and correct false user beliefs, in connection with their level of knowledge and their political bias. Our findings highlight significant challenges in LLMs’ ability to engage in grounding and reject false user beliefs, raising concerns about their role in mitigating misinformation in political discourse.
Anthology ID:
2025.acl-long.728
Volume:
Proceedings of the 63rd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
July
Year:
2025
Address:
Vienna, Austria
Editors:
Wanxiang Che, Joyce Nabende, Ekaterina Shutova, Mohammad Taher Pilehvar
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
14956–14975
URL:
https://preview.aclanthology.org/ingestion-acl-25/2025.acl-long.728/
Cite (ACL):
Clara Lachenmaier, Judith Sieker, and Sina Zarrieß. 2025. Can LLMs Ground when they (Don’t) Know: A Study on Direct and Loaded Political Questions. In Proceedings of the 63rd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 14956–14975, Vienna, Austria. Association for Computational Linguistics.
Cite (Informal):
Can LLMs Ground when they (Don’t) Know: A Study on Direct and Loaded Political Questions (Lachenmaier et al., ACL 2025)
PDF:
https://preview.aclanthology.org/ingestion-acl-25/2025.acl-long.728.pdf