Factual Knowledge in Language Models: Robustness and Anomalies under Simple Temporal Context Variations

Hichem Ammar Khodja, Frederic Bechet, Quentin Brabant, Alexis Nasr, Gwénolé Lecorvé


Abstract
This paper explores the robustness of language models (LMs) to variations in the temporal context of factual knowledge. It examines whether LMs can correctly associate a temporal context with a past fact that was valid over a defined period, by asking them to differentiate correct from incorrect contexts. This ability is analyzed along two dimensions: the distance of the incorrect context from the fact's validity period, and the granularity of the context. To this end, a dataset called TimeStress is introduced, enabling the evaluation of 18 diverse LMs. Results reveal that the best LM achieves a perfect distinction for only 11% of the studied facts, with errors that, although rare, are critical and of a kind humans would not make. This work highlights the limitations of current LMs in temporal representation.
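
To make the evaluation setup concrete, below is a minimal sketch of one way to compare an LM's scores for the same fact under correct and incorrect temporal contexts. This is not the paper's actual TimeStress protocol; the model (gpt2), the prompt template, and the example fact are assumptions chosen purely for illustration.

    # Illustrative sketch only: model, prompt, and fact are assumptions,
    # not the paper's actual TimeStress protocol.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained("gpt2")
    model = AutoModelForCausalLM.from_pretrained("gpt2")
    model.eval()

    def statement_logprob(text: str) -> float:
        """Total log-probability the model assigns to `text`."""
        inputs = tokenizer(text, return_tensors="pt")
        with torch.no_grad():
            out = model(**inputs, labels=inputs["input_ids"])
        # out.loss is the mean negative log-likelihood over predicted tokens,
        # so the total log-probability is -loss * (number of predicted tokens).
        n_predicted = inputs["input_ids"].shape[1] - 1
        return -out.loss.item() * n_predicted

    # A fact valid from 2009 to 2017, probed with temporal contexts at varying
    # distance from the validity period and at two granularities (year, month).
    template = "In {ctx}, the President of the United States was Barack Obama."
    for ctx in ["2012", "2018", "1995", "June 2012", "June 2020"]:
        print(f"{ctx:>10}: {statement_logprob(template.format(ctx=ctx)):.2f}")

    # A temporally robust LM should score every in-validity context ("2012",
    # "June 2012") above every out-of-validity one, regardless of distance.

Under this kind of probe, a model that ranks even one incorrect context above a correct one for a given fact would fail the perfect-distinction criterion reported in the abstract.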
Anthology ID: 2025.l2m2-1.1
Volume: Proceedings of the First Workshop on Large Language Model Memorization (L2M2)
Month: August
Year: 2025
Address: Vienna, Austria
Editors: Robin Jia, Eric Wallace, Yangsibo Huang, Tiago Pimentel, Pratyush Maini, Verna Dankers, Johnny Wei, Pietro Lesci
Venues: L2M2 | WS
Publisher: Association for Computational Linguistics
Pages: 1–22
URL: https://preview.aclanthology.org/landing_page/2025.l2m2-1.1/
DOI: 10.18653/v1/2025.l2m2-1.1
Cite (ACL): Hichem Ammar Khodja, Frederic Bechet, Quentin Brabant, Alexis Nasr, and Gwénolé Lecorvé. 2025. Factual Knowledge in Language Models: Robustness and Anomalies under Simple Temporal Context Variations. In Proceedings of the First Workshop on Large Language Model Memorization (L2M2), pages 1–22, Vienna, Austria. Association for Computational Linguistics.
Cite (Informal): Factual Knowledge in Language Models: Robustness and Anomalies under Simple Temporal Context Variations (Khodja et al., L2M2 2025)
PDF: https://preview.aclanthology.org/landing_page/2025.l2m2-1.1.pdf