Breaking the Reasoning Barrier A Survey on LLM Complex Reasoning through the Lens of Self-Evolution

Tao He, Hao Li, Jingchang Chen, Runxuan Liu, Yixin Cao, Lizi Liao, Zihao Zheng, Zheng Chu, Jiafeng Liang, Ming Liu, Bing Qin


Abstract
The release of OpenAI’s O1 and subsequent projects like DeepSeek R1 has significantly advanced research on complex reasoning in LLMs. This paper systematically analyzes existing reasoning studies from the perspective of self-evolution, structured into three components: data evolution, model evolution, and self-evolution. Data evolution explores methods to generate higher-quality reasoning training data. Model evolution focuses on training strategies to boost reasoning capabilities. Self-evolution investigates autonomous system improvement through iterated cycles of data and model evolution. We further discuss the scaling law of self-evolution and analyze representative O1-like works through this lens. By summarizing advanced methods and outlining future directions, this paper aims to drive advancements in LLMs’ reasoning abilities.
Anthology ID:
2025.findings-acl.386
Volume:
Findings of the Association for Computational Linguistics: ACL 2025
Month:
July
Year:
2025
Address:
Vienna, Austria
Editors:
Wanxiang Che, Joyce Nabende, Ekaterina Shutova, Mohammad Taher Pilehvar
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
7377–7417
URL:
https://preview.aclanthology.org/display_plenaries/2025.findings-acl.386/
Cite (ACL):
Tao He, Hao Li, Jingchang Chen, Runxuan Liu, Yixin Cao, Lizi Liao, Zihao Zheng, Zheng Chu, Jiafeng Liang, Ming Liu, and Bing Qin. 2025. Breaking the Reasoning Barrier A Survey on LLM Complex Reasoning through the Lens of Self-Evolution. In Findings of the Association for Computational Linguistics: ACL 2025, pages 7377–7417, Vienna, Austria. Association for Computational Linguistics.
Cite (Informal):
Breaking the Reasoning Barrier A Survey on LLM Complex Reasoning through the Lens of Self-Evolution (He et al., Findings 2025)
PDF:
https://preview.aclanthology.org/display_plenaries/2025.findings-acl.386.pdf