Explanations explained. Influence of Free-text Explanations on LLMs and the Role of Implicit Knowledge

Andrea Zaninello, Roberto Dessi, Malvina Nissim, Bernardo Magnini


Abstract
In this work, we investigate the relationship between the quality of explanations produced by different models and the amount of implicit knowledge they are able to provide beyond the input. We approximate explanation quality via accuracy on a downstream task with a standardized pipeline (GEISER) and study its correlation with three different association measures, each capturing a different aspect of implicitness, defined as a combination of relevance and novelty. We conduct experiments with three state-of-the-art LLMs on four tasks involving implicit knowledge, with explanations either confirming or contradicting the correct label. Our results demonstrate that providing quality explanations consistently improves the accuracy of LLM predictions, even when the models are not explicitly trained to take explanations as input, and underline the correlation between the implicit content delivered by an explanation and its effectiveness.
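
The following is a minimal sketch, not the paper's GEISER pipeline, illustrating the kind of analysis the abstract describes: measure the accuracy gain an LLM obtains when an explanation is appended to the input, then correlate that gain with an association score between input and explanation. The functions `predict` and `association_score`, as well as the data fields, are assumptions introduced purely for illustration.

```python
# Hypothetical sketch: accuracy gain from explanations vs. an association measure.
# `predict` (LLM label prediction) and `association_score` are placeholders.
from scipy.stats import spearmanr

def accuracy(examples, predict, with_explanation):
    correct = 0
    for ex in examples:
        prompt = ex["input"]
        if with_explanation:
            prompt += "\nExplanation: " + ex["explanation"]
        if predict(prompt) == ex["label"]:
            correct += 1
    return correct / len(examples)

def gain_vs_association(task_batches, predict, association_score):
    # One point per batch: (mean association of its explanations, accuracy gain).
    gains, assocs = [], []
    for batch in task_batches:
        base = accuracy(batch, predict, with_explanation=False)
        expl = accuracy(batch, predict, with_explanation=True)
        gains.append(expl - base)
        assocs.append(
            sum(association_score(ex["input"], ex["explanation"]) for ex in batch)
            / len(batch)
        )
    # Rank correlation between the implicit-content proxy and effectiveness.
    return spearmanr(assocs, gains)
```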
Anthology ID:
2025.starsem-1.17
Volume:
Proceedings of the 14th Joint Conference on Lexical and Computational Semantics (*SEM 2025)
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Lea Frermann, Mark Stevenson
Venue:
*SEM
Publisher:
Association for Computational Linguistics
Pages:
212–224
URL:
https://preview.aclanthology.org/ingest-emnlp/2025.starsem-1.17/
Cite (ACL):
Andrea Zaninello, Roberto Dessi, Malvina Nissim, and Bernardo Magnini. 2025. Explanations explained. Influence of Free-text Explanations on LLMs and the Role of Implicit Knowledge. In Proceedings of the 14th Joint Conference on Lexical and Computational Semantics (*SEM 2025), pages 212–224, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
Explanations explained. Influence of Free-text Explanations on LLMs and the Role of Implicit Knowledge (Zaninello et al., *SEM 2025)
PDF:
https://preview.aclanthology.org/ingest-emnlp/2025.starsem-1.17.pdf