Rethinking Data Mixture for Large Language Models: A Comprehensive Survey and New Perspectives

Yajiao Liu, Congliang Chen, Junchi Yang, Ruoyu Sun


Abstract
Training large language models on data collected from diverse domains can improve their performance on downstream tasks. Given a fixed training budget, however, the sampling proportions of these domains significantly affect the model's performance. How can we choose the domain weights so as to train the best-performing model under constrained computational resources? In this paper, we provide a comprehensive overview of existing data mixture methods. First, we propose a fine-grained categorization of existing methods that extends beyond the previous offline/online classification. Offline methods are further grouped into heuristic-based, algorithm-based, and function fitting-based methods. Online methods are categorized into three groups (online min-max optimization, online mixing law, and other approaches) by drawing connections to the optimization frameworks underlying offline methods. Second, we summarize the problem formulations and representative algorithms for each subtype of offline and online methods, and clarify the relationships and distinctions among them. Finally, we discuss the advantages and disadvantages of each method and highlight key challenges in the field of data mixture.
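The central object of the survey, a vector of domain weights that sets the sampling proportions of the training corpus, can be made concrete with a short sketch. The domain names, weights, and helper function below are hypothetical illustrations, not taken from the paper:

```python
import random

def sample_batch(domain_data, domain_weights, batch_size, seed=0):
    """Draw a training batch by first sampling each example's domain
    according to the domain weights, then sampling an example from
    that domain uniformly at random."""
    rng = random.Random(seed)
    domains = list(domain_weights)
    weights = [domain_weights[d] for d in domains]
    batch = []
    for _ in range(batch_size):
        d = rng.choices(domains, weights=weights, k=1)[0]  # pick a domain
        batch.append(rng.choice(domain_data[d]))           # pick an example
    return batch

# Hypothetical three-domain corpus; weights sum to 1.
corpus = {"web": ["w1", "w2", "w3"], "code": ["c1", "c2"], "books": ["b1"]}
weights = {"web": 0.6, "code": 0.3, "books": 0.1}
batch = sample_batch(corpus, weights, batch_size=8)
```

The data mixture methods surveyed here differ in how they choose `weights`: offline methods fix them before training (by heuristics, algorithms, or fitted mixing laws), while online methods update them during training.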
Anthology ID:
2026.findings-eacl.15
Volume:
Findings of the Association for Computational Linguistics: EACL 2026
Month:
March
Year:
2026
Address:
Rabat, Morocco
Editors:
Vera Demberg, Kentaro Inui, Lluís Marquez
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
275–289
URL:
https://preview.aclanthology.org/ingest-eacl/2026.findings-eacl.15/
Cite (ACL):
Yajiao Liu, Congliang Chen, Junchi Yang, and Ruoyu Sun. 2026. Rethinking Data Mixture for Large Language Models: A Comprehensive Survey and New Perspectives. In Findings of the Association for Computational Linguistics: EACL 2026, pages 275–289, Rabat, Morocco. Association for Computational Linguistics.
Cite (Informal):
Rethinking Data Mixture for Large Language Models: A Comprehensive Survey and New Perspectives (Liu et al., Findings 2026)
PDF:
https://preview.aclanthology.org/ingest-eacl/2026.findings-eacl.15.pdf
Checklist:
 2026.findings-eacl.15.checklist.pdf