Drop Dropout on Single Epoch Language Model Pretraining

Houjun Liu, John Bauer, Christopher D Manning


Abstract
Originally, dropout was seen as a breakthrough regularization technique that reduced overfitting and improved performance in almost all applications of deep learning. Yet the single-epoch pretraining regime common to modern LLMs yields minimal overfitting, and dropout is consequently omitted when training large LLMs. Nevertheless, no thorough empirical investigation has examined the role of dropout in LM pretraining. Through experiments in single-epoch pretraining of both masked (BERT) and autoregressive (Pythia 160M and 1.4B) LMs with varying levels of dropout, we find that downstream performance in language modeling, morpho-syntax (BLiMP), question answering (SQuAD), and natural-language inference (MNLI) improves when dropout is not applied during pretraining. We additionally find that the recently-introduced “early dropout” also degrades performance relative to applying no dropout at all. We further investigate the models’ editability, and find that models trained without dropout are more successful in gradient-based model editing (MEND) and equivalent in representation-based model editing (ReFT). We therefore advocate **dropping dropout** during single-epoch pretraining.
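
In practice, the intervention described in the abstract, turning dropout off during pretraining, comes down to the dropout probabilities in the model configuration. The sketch below, assuming Hugging Face Transformers, shows how one might configure a Pythia-style (GPT-NeoX) and a BERT model with dropout disabled; the layer sizes are illustrative rather than the paper's 160M/1.4B settings, and the disable_dropout helper for ending an "early dropout" phase is a hypothetical illustration, not the authors' implementation.

```python
# Minimal sketch (not the authors' exact training setup): pretraining
# configurations with dropout disabled, using Hugging Face Transformers.
from torch import nn
from transformers import (
    BertConfig,
    BertForMaskedLM,
    GPTNeoXConfig,
    GPTNeoXForCausalLM,
)

# Autoregressive LM (Pythia uses the GPT-NeoX architecture): both dropout
# sites are set to 0.0. Layer sizes here are illustrative, not the
# paper's 160M/1.4B configurations.
pythia_like_cfg = GPTNeoXConfig(
    hidden_size=768,
    num_hidden_layers=12,
    num_attention_heads=12,
    hidden_dropout=0.0,
    attention_dropout=0.0,
)
pythia_like_lm = GPTNeoXForCausalLM(pythia_like_cfg)

# Masked LM: the stock BERT config defaults both probabilities to 0.1,
# so "dropping dropout" means overriding them to 0.0.
bert_cfg = BertConfig(
    hidden_dropout_prob=0.0,
    attention_probs_dropout_prob=0.0,
)
bert_mlm = BertForMaskedLM(bert_cfg)


def disable_dropout(model: nn.Module) -> None:
    """Zero every nn.Dropout probability, e.g. to end an "early dropout"
    phase partway through training (hypothetical helper, for illustration)."""
    for module in model.modules():
        if isinstance(module, nn.Dropout):
            module.p = 0.0
```

Setting the probabilities in the config keeps the architecture identical across the dropout and no-dropout conditions, so a comparison like the paper's isolates the effect of the regularizer itself.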
Anthology ID:
2025.findings-acl.111
Volume:
Findings of the Association for Computational Linguistics: ACL 2025
Month:
July
Year:
2025
Address:
Vienna, Austria
Editors:
Wanxiang Che, Joyce Nabende, Ekaterina Shutova, Mohammad Taher Pilehvar
Venues:
Findings | WS
Publisher:
Association for Computational Linguistics
Pages:
2157–2166
URL:
https://preview.aclanthology.org/ingestion-acl-25/2025.findings-acl.111/
Cite (ACL):
Houjun Liu, John Bauer, and Christopher D Manning. 2025. Drop Dropout on Single Epoch Language Model Pretraining. In Findings of the Association for Computational Linguistics: ACL 2025, pages 2157–2166, Vienna, Austria. Association for Computational Linguistics.
Cite (Informal):
Drop Dropout on Single Epoch Language Model Pretraining (Liu et al., Findings 2025)
PDF:
https://preview.aclanthology.org/ingestion-acl-25/2025.findings-acl.111.pdf