Investigating the Zone of Proximal Development of Language Models for In-Context Learning

Peng Cui, Mrinmaya Sachan


Abstract
In this paper, we introduce a learning analytics framework to analyze the in-context learning (ICL) behavior of large language models (LLMs) through the lens of the Zone of Proximal Development (ZPD), an established theory in educational psychology. The ZPD delineates the range of tasks a learner can accomplish with appropriate guidance but not yet independently. We adapt this concept to ICL, measuring the ZPD of LLMs based on model performance on individual examples in different settings. Furthermore, we propose an item response theory (IRT) model to predict the distribution of zones for LLMs. Our findings reveal a series of intricate and multifaceted ICL behaviors, providing new insights into understanding and leveraging this technique. Finally, we demonstrate how our framework can enhance LLMs in both inference and fine-tuning scenarios: (1) By predicting a model’s zone distribution, we selectively apply ICL to queries that are most likely to benefit from demonstrations, achieving a better balance between inference cost and performance; (2) We propose a human-like curriculum for fine-tuning, which prioritizes examples within the model’s ZPD. The curriculum results in improved performance, and we explain its effectiveness through an analysis of the training dynamics of LLMs.
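To illustrate the core idea, here is a minimal sketch of how a ZPD classification could be derived from an IRT model. This is not the paper's actual model: it assumes a simple Rasch (1PL) item response function, hypothetical ability parameters for the zero-shot and few-shot settings, and an arbitrary threshold `tau` for "likely solved".

```python
import math

def p_correct(theta, difficulty):
    # Rasch (1PL) item response: P(success) = sigmoid(ability - difficulty)
    return 1.0 / (1.0 + math.exp(-(theta - difficulty)))

def zone(theta_zero_shot, theta_few_shot, difficulty, tau=0.5):
    """Classify an item relative to the model's ZPD (illustrative only).

    'mastered' : likely solved even without demonstrations
    'zpd'      : likely solved only with demonstrations (the ZPD)
    'beyond'   : likely unsolved in either setting
    """
    p_zero = p_correct(theta_zero_shot, difficulty)
    p_few = p_correct(theta_few_shot, difficulty)
    if p_zero >= tau:
        return "mastered"
    if p_few >= tau:
        return "zpd"
    return "beyond"

# Toy example: assume demonstrations raise effective ability from 0.0 to 1.5
for d in (-1.0, 0.8, 3.0):
    print(d, zone(0.0, 1.5, d))  # -> mastered, zpd, beyond
```

Under this sketch, "zpd" items are exactly those where demonstrations are predicted to flip failure into success, which is the set the paper targets for selective ICL and curriculum fine-tuning.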
Anthology ID: 2025.findings-naacl.362
Volume: Findings of the Association for Computational Linguistics: NAACL 2025
Month: April
Year: 2025
Address: Albuquerque, New Mexico
Editors: Luis Chiruzzo, Alan Ritter, Lu Wang
Venue: Findings
Publisher: Association for Computational Linguistics
Pages: 6470–6483
URL: https://preview.aclanthology.org/fix-sig-urls/2025.findings-naacl.362/
Cite (ACL): Peng Cui and Mrinmaya Sachan. 2025. Investigating the Zone of Proximal Development of Language Models for In-Context Learning. In Findings of the Association for Computational Linguistics: NAACL 2025, pages 6470–6483, Albuquerque, New Mexico. Association for Computational Linguistics.
Cite (Informal): Investigating the Zone of Proximal Development of Language Models for In-Context Learning (Cui & Sachan, Findings 2025)
PDF: https://preview.aclanthology.org/fix-sig-urls/2025.findings-naacl.362.pdf