Convergence and Divergence of Language Models under Different Random Seeds

Finlay Fehlauer, Kyle Mahowald, Tiago Pimentel


Abstract
In this paper, we investigate the convergence of language models (LMs) trained under different random seeds, measuring convergence as the expected per-token Kullback–Leibler (KL) divergence across seeds. By comparing LM convergence as a function of model size and training checkpoint, we identify a four-phase convergence pattern: (i) an initial uniform phase, (ii) a sharp-convergence phase, (iii) a sharp-divergence phase, and (iv) a slow-reconvergence phase. Further, we observe that larger models reconverge faster in later training stages, while smaller models never actually reconverge; these results suggest that a certain model size may be necessary to learn stable distributions. Restricting our analysis to specific token frequencies or part-of-speech (PoS) tags further reveals that convergence is uneven across linguistic categories: frequent tokens and function words converge faster and more reliably than their counterparts (infrequent tokens and content words). Overall, our findings highlight factors that influence the stability of the learned distributions in model training.
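The convergence metric described above, the expected per-token KL divergence between two seeds' next-token distributions, can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes each model exposes next-token logits of shape (num_tokens, vocab_size) for the same evaluation text, and the function names are hypothetical.

```python
import numpy as np

def log_softmax(logits):
    # Numerically stable log-softmax over the vocabulary axis.
    shifted = logits - logits.max(axis=-1, keepdims=True)
    return shifted - np.log(np.exp(shifted).sum(axis=-1, keepdims=True))

def mean_per_token_kl(p_logits, q_logits):
    """Average per-token KL(p || q) between two models' next-token
    distributions. Both inputs: logits, shape (num_tokens, vocab_size)."""
    log_p = log_softmax(p_logits)
    log_q = log_softmax(q_logits)
    p = np.exp(log_p)
    # KL per token position, then mean over positions.
    return float((p * (log_p - log_q)).sum(axis=-1).mean())
```

In practice one would average this quantity over pairs of seeds (and over an evaluation corpus) to estimate the expected cross-seed divergence at a given checkpoint; identical distributions yield a KL of zero.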
Anthology ID:
2025.emnlp-main.1675
Volume:
Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
32970–32979
URL:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.1675/
Cite (ACL):
Finlay Fehlauer, Kyle Mahowald, and Tiago Pimentel. 2025. Convergence and Divergence of Language Models under Different Random Seeds. In Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing, pages 32970–32979, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
Convergence and Divergence of Language Models under Different Random Seeds (Fehlauer et al., EMNLP 2025)
PDF:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.1675.pdf
Checklist:
2025.emnlp-main.1675.checklist.pdf