Scaling Laws Are Unreliable for Downstream Tasks: A Reality Check

Nicholas Lourie, Michael Y. Hu, Kyunghyun Cho


Abstract
Downstream scaling laws aim to predict task performance at larger scales from a model's performance at smaller scales. Whether such prediction should be possible is unclear: some works discover clear linear scaling trends after simple transformations of the performance metric, whereas others point out fundamental challenges to downstream scaling laws, such as emergence and inverse scaling. In this work, we conduct a meta-analysis of existing data on downstream scaling laws and find that predictable scaling occurs in only a minority of cases: 39% of the time. Moreover, seemingly benign changes to the experimental setting can completely change the scaling behavior. Our analysis underscores the need to understand the conditions under which scaling laws succeed. To accurately model the relationship between pretraining loss and task performance, we must embrace the cases in which scaling behavior deviates from linear trends.
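
To make the setup concrete, here is a minimal sketch (not the authors' code) of the extrapolation procedure the abstract describes: transform the task metric, fit a linear trend on small-scale runs, then extrapolate to a larger scale. The data points, the choice of logit transform, and the target compute budget are all illustrative assumptions.

```python
import numpy as np

# Hypothetical small-scale observations: pretraining compute (FLOPs)
# and downstream task accuracy. Values are illustrative only.
compute = np.array([1e18, 3e18, 1e19, 3e19, 1e20])
accuracy = np.array([0.32, 0.38, 0.45, 0.52, 0.60])

# "Simple transformation" of the performance metric: the logit turns a
# sigmoidal accuracy curve into an (approximately) linear trend in
# log-compute.
x = np.log10(compute)
y = np.log(accuracy / (1 - accuracy))  # logit transform

# Fit the linear trend at small scales...
slope, intercept = np.polyfit(x, y, 1)

# ...and extrapolate to a larger scale (here, an assumed 1e21 FLOPs).
y_pred = slope * np.log10(1e21) + intercept
acc_pred = 1 / (1 + np.exp(-y_pred))  # invert the logit
print(f"Predicted accuracy at 1e21 FLOPs: {acc_pred:.3f}")
```

Per the paper's meta-analysis, extrapolations of this kind hold in only a minority of settings, so a fit like this should be checked against held-out intermediate scales before its large-scale prediction is trusted.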
Anthology ID: 2025.findings-emnlp.877
Volume: Findings of the Association for Computational Linguistics: EMNLP 2025
Month: November
Year: 2025
Address: Suzhou, China
Editors: Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rosé, Violet Peng
Venue: Findings
Publisher: Association for Computational Linguistics
Pages: 16167–16180
URL: https://preview.aclanthology.org/author-page-yu-wang-polytechnic/2025.findings-emnlp.877/
DOI: 10.18653/v1/2025.findings-emnlp.877
Cite (ACL): Nicholas Lourie, Michael Y. Hu, and Kyunghyun Cho. 2025. Scaling Laws Are Unreliable for Downstream Tasks: A Reality Check. In Findings of the Association for Computational Linguistics: EMNLP 2025, pages 16167–16180, Suzhou, China. Association for Computational Linguistics.
Cite (Informal): Scaling Laws Are Unreliable for Downstream Tasks: A Reality Check (Lourie et al., Findings 2025)
PDF: https://preview.aclanthology.org/author-page-yu-wang-polytechnic/2025.findings-emnlp.877.pdf
Checklist: 2025.findings-emnlp.877.checklist.pdf