Un-considering Contextual Information: Assessing LLMs’ Understanding of Indexical Elements

Metehan Oğuz, Yavuz Faruk Bakman, Duygu Nur Yaldiz


Abstract
Large Language Models (LLMs) have demonstrated impressive performance on tasks related to coreference resolution. However, previous studies have mostly assessed LLM performance on coreference resolution with nouns and third-person pronouns. This study evaluates LLM performance on coreference resolution with indexicals such as I, you, here, and tomorrow, which pose unique challenges due to their linguistic properties. We present the first study examining how LLMs interpret indexicals in English, releasing the English Indexical Dataset with 1600 multiple-choice questions. We evaluate pioneering LLMs, including GPT-4o, Claude 3.5 Sonnet, Gemini 1.5 Pro, and DeepSeek V3. Our results reveal that LLMs perform impressively with some indexicals (I) while struggling with others (you, here, tomorrow), and that syntactic cues (e.g., quotation) improve LLM performance with some indexicals while reducing it with others. Code and data are available at: https://github.com/metehanoguzz/LLMs-Indexicals-English
Anthology ID:
2025.findings-acl.1203
Volume:
Findings of the Association for Computational Linguistics: ACL 2025
Month:
July
Year:
2025
Address:
Vienna, Austria
Editors:
Wanxiang Che, Joyce Nabende, Ekaterina Shutova, Mohammad Taher Pilehvar
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
23410–23427
URL:
https://preview.aclanthology.org/mtsummit-25-ingestion/2025.findings-acl.1203/
DOI:
10.18653/v1/2025.findings-acl.1203
Cite (ACL):
Metehan Oğuz, Yavuz Faruk Bakman, and Duygu Nur Yaldiz. 2025. Un-considering Contextual Information: Assessing LLMs’ Understanding of Indexical Elements. In Findings of the Association for Computational Linguistics: ACL 2025, pages 23410–23427, Vienna, Austria. Association for Computational Linguistics.
Cite (Informal):
Un-considering Contextual Information: Assessing LLMs’ Understanding of Indexical Elements (Oğuz et al., Findings 2025)
PDF:
https://preview.aclanthology.org/mtsummit-25-ingestion/2025.findings-acl.1203.pdf