Scalable Data Synthesis through Human-like Cognitive Imitation and Data Recombination

Zhongyi Ye, Weitai Zhang, Xinyuan Zhou, Yongxin Zhu, Ninghui Rao, Enhong Chen


Abstract
Large language models (LLMs) rely on massive amounts of training data; however, the quantity of empirically observed data is limited. To alleviate this issue, many LLMs leverage synthetic data to increase the quantity of training data. Despite significant advances in LLMs, the efficiency and scalability of data synthesis during the pre-training phase remain insufficiently explored. In this work, we propose a novel data synthesis framework, Cognitive Combination Synthesis (CCS), designed to achieve highly efficient and scalable data synthesis. Specifically, our methodology mimics human cognitive behaviors by recombining and interconnecting heterogeneous data from diverse sources, thereby enhancing advanced reasoning capabilities in LLMs. Extensive experiments demonstrate that: (1) effective data organization is essential, and our mapping-based combination learning approach significantly improves data utilization efficiency; (2) by enhancing data diversity, accuracy, and complexity, our synthetic data scales beyond 100B tokens, revealing CCS's strong scalability. Our findings highlight the impact of data organization methods on LLM learning efficiency and the significant potential of scalable synthetic data to enhance model reasoning capabilities.
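
To make the core idea of "recombining heterogeneous data from diverse sources" concrete, here is a minimal, purely illustrative Python sketch. It is not the authors' CCS pipeline; the corpora, function names, and prompt template below are all hypothetical placeholders showing one generic way snippets from different domains could be paired into composite synthetic examples.

```python
# Hypothetical illustration of cross-source data recombination.
# This is NOT the authors' CCS implementation, only a generic sketch.
import itertools
import random

# Toy "heterogeneous" corpora drawn from different domains (placeholders).
SOURCES = {
    "math": ["If x + 3 = 7, then x = 4.",
             "The derivative of x^2 is 2x."],
    "code": ["A Python list comprehension builds a list in one expression.",
             "Recursion needs a base case to terminate."],
    "science": ["Water boils at 100 C at sea level.",
                "Photosynthesis converts light into chemical energy."],
}

def recombine(sources, num_examples=4, seed=0):
    """Pair snippets from two distinct domains and join them with a
    connective prompt, yielding composite synthetic training examples."""
    rng = random.Random(seed)
    domain_pairs = list(itertools.combinations(sources, 2))
    examples = []
    for _ in range(num_examples):
        a, b = rng.choice(domain_pairs)          # pick two distinct domains
        snippet_a = rng.choice(sources[a])        # sample one snippet each
        snippet_b = rng.choice(sources[b])
        examples.append(
            f"Fact ({a}): {snippet_a}\n"
            f"Fact ({b}): {snippet_b}\n"
            "Question: explain how these two facts could be connected."
        )
    return examples

if __name__ == "__main__":
    for ex in recombine(SOURCES):
        print(ex, end="\n\n")
```

In the paper itself, the recombination is organized by a mapping-based combination learning approach rather than the uniform random pairing above; see the linked PDF for the actual method.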
Anthology ID:
2025.emnlp-main.236
Volume:
Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
4721–4735
URL:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.236/
Cite (ACL):
Zhongyi Ye, Weitai Zhang, Xinyuan Zhou, Yongxin Zhu, Ninghui Rao, and Enhong Chen. 2025. Scalable Data Synthesis through Human-like Cognitive Imitation and Data Recombination. In Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing, pages 4721–4735, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
Scalable Data Synthesis through Human-like Cognitive Imitation and Data Recombination (Ye et al., EMNLP 2025)
PDF:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.236.pdf
Checklist:
 2025.emnlp-main.236.checklist.pdf