Language Models Grow Less Humanlike beyond Phase Transition

Tatsuya Aoyama, Ethan Wilcox


Abstract
LMs’ alignment with human reading behavior (i.e., psychometric predictive power; PPP) is known to improve during pretraining up to a tipping point, beyond which it either plateaus or degrades. Various factors, such as word frequency, recency bias in attention, and context size, have been theorized to affect PPP, yet there is no current account that explains why such a tipping point exists and how it interacts with LMs’ pretraining dynamics more generally. We hypothesize that the underlying factor is a pretraining phase transition, characterized by the rapid emergence of specialized attention heads. We conduct a series of correlational and causal experiments to show that such a phase transition is responsible for the tipping point in PPP. We then show that, rather than producing attention patterns that contribute to the degradation in PPP, phase transitions alter the subsequent learning dynamics of the model, such that further training continues to degrade PPP.
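For context, PPP in this line of work is commonly operationalized as the per-token log-likelihood gain from adding LM surprisal to a baseline regression that predicts human reading times. The sketch below illustrates that general recipe only; the predictors, column names, and data are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch of one common PPP measure (delta log-likelihood per token).
# All variable names and values below are hypothetical, for illustration only.
import pandas as pd
import statsmodels.formula.api as smf

# One row per word: reading time (ms), word length, log frequency, and the
# LM's surprisal for that word at a given pretraining checkpoint.
df = pd.DataFrame({
    "rt":        [310.0, 295.0, 401.0, 350.0, 280.0, 330.0],
    "length":    [4, 3, 9, 7, 2, 5],
    "log_freq":  [-6.1, -3.2, -9.8, -8.0, -2.5, -7.1],
    "surprisal": [5.2, 2.1, 11.3, 8.7, 1.4, 6.9],
})

# Baseline regression vs. the same regression with surprisal added.
baseline = smf.ols("rt ~ length + log_freq", data=df).fit()
target   = smf.ols("rt ~ length + log_freq + surprisal", data=df).fit()

# PPP as the per-token log-likelihood improvement: higher values mean the
# LM's surprisal explains reading times better, i.e. the model is more humanlike.
ppp = (target.llf - baseline.llf) / len(df)
print(f"Delta log-likelihood per token: {ppp:.3f}")
```

Tracking this quantity across pretraining checkpoints is what yields the rise-then-decline curve the abstract describes; whether the paper uses exactly this regression setup is an assumption here.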
Anthology ID: 2025.acl-long.1214
Volume: Proceedings of the 63rd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month: July
Year: 2025
Address: Vienna, Austria
Editors: Wanxiang Che, Joyce Nabende, Ekaterina Shutova, Mohammad Taher Pilehvar
Venue: ACL
Publisher: Association for Computational Linguistics
Pages: 24938–24958
URL: https://preview.aclanthology.org/ingestion-acl-25/2025.acl-long.1214/
Cite (ACL): Tatsuya Aoyama and Ethan Wilcox. 2025. Language Models Grow Less Humanlike beyond Phase Transition. In Proceedings of the 63rd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 24938–24958, Vienna, Austria. Association for Computational Linguistics.
Cite (Informal): Language Models Grow Less Humanlike beyond Phase Transition (Aoyama & Wilcox, ACL 2025)
PDF: https://preview.aclanthology.org/ingestion-acl-25/2025.acl-long.1214.pdf