On the Role of Linguistic Features in LLM Performance on Theory of Mind Tasks

Ekaterina Kozachenko, Gonçalo Guiomar, Karolina Stanczak


Abstract
Theory of Mind presents a fundamental challenge for Large Language Models (LLMs), revealing gaps in processing intensional contexts where beliefs diverge from reality. We analyze six LLMs across 2,860 annotated stories, measuring factors such as idea density, mental state verb distribution, and perspectival complexity markers. Notably, and in contrast to humans, LLM accuracy correlates positively with linguistic complexity: the models achieve high accuracy (74–95%) on high-complexity stories with explicit mental state scaffolding, yet struggle with low-complexity tasks requiring implicit reasoning (51–77%). Furthermore, we find that linguistic markers systematically influence performance, with contrast markers decreasing accuracy by 5–9% and knowledge verbs increasing it by 4–10%. This relationship between linguistic complexity and performance, the inverse of what is observed in human cognition, may suggest that current LLMs rely on surface-level linguistic cues rather than genuine mental state reasoning.
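
To make the marker analysis concrete, below is a minimal Python sketch of one way to estimate how a surface linguistic marker relates to model accuracy: compare accuracy on stories that contain the marker against accuracy on stories that do not. The marker lexicons, story texts, and per-story correctness flags are invented for illustration only; this is not the authors' pipeline or data.

# Hypothetical sketch: accuracy gap for stories with vs. without a marker.
import re
import statistics

CONTRAST = re.compile(r"\b(but|however|although|yet)\b", re.IGNORECASE)
KNOWLEDGE = re.compile(r"\b(knows?|realizes?|understands?)\b", re.IGNORECASE)

def accuracy_gap(stories, correct, pattern):
    """Accuracy on stories containing `pattern` minus accuracy on the rest."""
    with_marker = [c for s, c in zip(stories, correct) if pattern.search(s)]
    without = [c for s, c in zip(stories, correct) if not pattern.search(s)]
    if not with_marker or not without:
        return None  # marker absent or universal; gap undefined
    return statistics.mean(with_marker) - statistics.mean(without)

# Toy data, invented for illustration.
stories = [
    "Sally puts the ball in the basket, but Anne moves it to the box.",
    "Tom knows that the keys are in the drawer.",
    "Mary thinks the cake is in the fridge, although it was eaten.",
    "Ben knows where his bike is parked.",
]
correct = [0, 1, 0, 1]  # hypothetical correctness of one model per story

print("contrast-marker gap:", accuracy_gap(stories, correct, CONTRAST))
print("knowledge-verb gap:", accuracy_gap(stories, correct, KNOWLEDGE))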
Anthology ID:
2025.iwcs-1.27
Volume:
Proceedings of the 16th International Conference on Computational Semantics
Month:
September
Year:
2025
Address:
Düsseldorf, Germany
Editors:
Kilian Evang, Laura Kallmeyer, Sylvain Pogodalla
Venues:
IWCS | WS
Publisher:
Association for Computational Linguistics
Pages:
308–316
URL:
https://preview.aclanthology.org/iwcs-25-ingestion/2025.iwcs-1.27/
Cite (ACL):
Ekaterina Kozachenko, Gonçalo Guiomar, and Karolina Stanczak. 2025. On the Role of Linguistic Features in LLM Performance on Theory of Mind Tasks. In Proceedings of the 16th International Conference on Computational Semantics, pages 308–316, Düsseldorf, Germany. Association for Computational Linguistics.
Cite (Informal):
On the Role of Linguistic Features in LLM Performance on Theory of Mind Tasks (Kozachenko et al., IWCS 2025)
PDF:
https://preview.aclanthology.org/iwcs-25-ingestion/2025.iwcs-1.27.pdf