Simulating Student Interactions for Virtual Pretesting with In-Context Learning

Arthur Thuy, Luca Benedetto, Ekaterina Loginova, Dries F. Benoit


Abstract
Recent research has experimented with using Large Language Models (LLMs) to simulate student responses to exam questions. This approach, known as virtual pretesting, potentially offers a scalable alternative to traditional pretesting, which is costly and time-intensive, by enabling the creation of datasets of virtual students’ responses. Prior studies focused on zero-shot role-playing, prompting one LLM to imitate students of different levels, but showed limited alignment with the response patterns of real students. This work introduces a framework that improves the alignment of LLM-based student simulations through in-context learning (ICL), leveraging previous question-answer records to provide the model with richer information about students’ skills and misconceptions. Our experiments show that not all models can leverage the additional contextual information. However, a multi-model approach, which combines simulations from several models, significantly improves the alignment of the simulated responses when provided with relevant context: we observe a reduction of up to 30% in difficulty estimation RMSE with respect to the non-contextual and individual contextual models. Overall, our findings indicate that LLMs can be used with ICL to create synthetic datasets of student responses that approximate some patterns of learner behavior; however, their ability to align with authentic student performance remains limited.
Anthology ID:
2026.lrec-main.815
Volume:
Proceedings of the Fifteenth Language Resources and Evaluation Conference
Month:
May
Year:
2026
Address:
Palma de Mallorca, Spain
Editors:
Stelios Piperidis, Núria Bel, Henk van den Heuvel, Nancy Ide, Simon Krek, Antonio Toral
Venue:
LREC
Publisher:
ELRA Language Resource Association
Pages:
10376–10389
URL:
https://preview.aclanthology.org/ingest-lrec/2026.lrec-main.815/
Cite (ACL):
Arthur Thuy, Luca Benedetto, Ekaterina Loginova, and Dries F. Benoit. 2026. Simulating Student Interactions for Virtual Pretesting with In-Context Learning. In Proceedings of the Fifteenth Language Resources and Evaluation Conference, pages 10376–10389, Palma de Mallorca, Spain. ELRA Language Resource Association.
Cite (Informal):
Simulating Student Interactions for Virtual Pretesting with In-Context Learning (Thuy et al., LREC 2026)
PDF:
https://preview.aclanthology.org/ingest-lrec/2026.lrec-main.815.pdf