Optimizing Pre-Training Data Mixtures with Mixtures of Data Expert Models

Lior Belenki, Alekh Agarwal, Tianze Shi, Kristina Toutanova


Abstract
We propose a method to optimize language model pre-training data mixtures through efficient approximation of the cross-entropy loss corresponding to each candidate mixture via a Mixture of Data Experts (MDE). We use this approximation as a source of additional features in a regression model, trained from observations of model loss for a small number of mixtures. Experiments with Transformer decoder-only language models in the range of 70M to 10B parameters on the SlimPajama dataset show that our method achieves significantly better performance than approaches that train regression models using only the mixture rates as input features. Combining this improved optimization method with an objective that takes into account cross-entropy on end task data leads to superior performance on few-shot downstream evaluations. We also provide theoretical insights on why aggregation of data expert predictions can provide good approximations to model losses for data mixtures.
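The sketch below illustrates one possible reading of the approach summarized in the abstract: score each candidate mixture with an MDE estimate (a mixture-weighted ensemble of per-domain data expert token probabilities) and use that estimate, alongside the mixture rates themselves, as features in a regression model fit on a small number of observed losses. This is a minimal sketch under stated assumptions, not the paper's implementation: the exact ensembling rule, the choice of ridge regression, and all names (`mde_loss_estimate`, `fit_loss_regressor`, `expert_token_probs`) are illustrative.

```python
import numpy as np
from sklearn.linear_model import Ridge


def mde_loss_estimate(expert_token_probs, mixture_weights, eps=1e-12):
    """Approximate the eval cross-entropy of a model trained on a data mixture.

    expert_token_probs: (num_domains, num_tokens) array with the probability
        each per-domain "data expert" assigns to the gold next tokens on a
        held-out evaluation set.
    mixture_weights: (num_domains,) candidate mixture proportions.
    """
    w = np.asarray(mixture_weights, dtype=np.float64)
    w = w / w.sum()
    # Ensemble the experts by mixture-weighting their token probabilities
    # (assumed aggregation rule for the MDE approximation).
    ensemble_probs = w @ np.asarray(expert_token_probs)  # shape: (num_tokens,)
    # Average negative log-likelihood of the ensemble on the eval tokens.
    return float(-np.mean(np.log(ensemble_probs + eps)))


def fit_loss_regressor(mixtures, observed_losses, expert_token_probs):
    """Fit a regressor from mixture rates plus the MDE feature to observed loss.

    mixtures: (num_runs, num_domains) mixture rates of a few trained models.
    observed_losses: (num_runs,) eval losses measured for those runs.
    """
    mixtures = np.asarray(mixtures, dtype=np.float64)
    mde_feats = np.array(
        [[mde_loss_estimate(expert_token_probs, m)] for m in mixtures]
    )
    features = np.hstack([mixtures, mde_feats])  # mixture rates + MDE estimate
    # Ridge regression is an illustrative choice of regressor, not the paper's.
    return Ridge(alpha=1.0).fit(features, observed_losses)
```

At optimization time, candidate mixtures could then be scored with the fitted regressor (each candidate featurized the same way) and the lowest predicted loss selected.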
Anthology ID: 2025.acl-long.1564
Volume: Proceedings of the 63rd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month: July
Year: 2025
Address: Vienna, Austria
Editors: Wanxiang Che, Joyce Nabende, Ekaterina Shutova, Mohammad Taher Pilehvar
Venue: ACL
Publisher: Association for Computational Linguistics
Pages: 32570–32587
URL: https://preview.aclanthology.org/ingestion-acl-25/2025.acl-long.1564/
Cite (ACL): Lior Belenki, Alekh Agarwal, Tianze Shi, and Kristina Toutanova. 2025. Optimizing Pre-Training Data Mixtures with Mixtures of Data Expert Models. In Proceedings of the 63rd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 32570–32587, Vienna, Austria. Association for Computational Linguistics.
Cite (Informal): Optimizing Pre-Training Data Mixtures with Mixtures of Data Expert Models (Belenki et al., ACL 2025)
PDF: https://preview.aclanthology.org/ingestion-acl-25/2025.acl-long.1564.pdf