Abstract
Cloze multiple-choice questions (MCQs) are essential for assessing comprehension in educational settings, but manually designing effective distractors is time-consuming. Recent research has therefore automated distractor generation, yet such methods often fail to adjust question difficulty to the learner's ability, resulting in non-personalized assessments. This study introduces the Personalized Cloze Test Generation (PCGL) Framework, which uses Large Language Models (LLMs) to generate cloze tests tailored to individual proficiency levels. The PCGL Framework simplifies test creation by generating both the question stem and the distractors from a single input word, and it adjusts difficulty to match the learner's proficiency. The framework significantly reduces the effort of creating tests and enhances personalized learning by adapting dynamically to each learner's needs.

- Anthology ID: 2024.inlg-main.26
- Volume: Proceedings of the 17th International Natural Language Generation Conference
- Month: September
- Year: 2024
- Address: Tokyo, Japan
- Editors: Saad Mahamood, Nguyen Le Minh, Daphne Ippolito
- Venue: INLG
- SIG: SIGGEN
- Publisher: Association for Computational Linguistics
- Pages: 314–319
- URL: https://aclanthology.org/2024.inlg-main.26
- Cite (ACL): Chih-Hsuan Shen, Yi-Li Kuo, and Yao-Chung Fan. 2024. Personalized Cloze Test Generation with Large Language Models: Streamlining MCQ Development and Enhancing Adaptive Learning. In Proceedings of the 17th International Natural Language Generation Conference, pages 314–319, Tokyo, Japan. Association for Computational Linguistics.
- Cite (Informal): Personalized Cloze Test Generation with Large Language Models: Streamlining MCQ Development and Enhancing Adaptive Learning (Shen et al., INLG 2024)
- PDF: https://preview.aclanthology.org/add_acl24_videos/2024.inlg-main.26.pdf
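The workflow the abstract describes — generating a cloze stem and distractors from a single input word at a target proficiency level — could be sketched as a prompt-construction step for an LLM. This is a hypothetical illustration, not the authors' implementation: the function name `build_cloze_prompt`, the prompt wording, and the use of CEFR level labels are all assumptions.

```python
def build_cloze_prompt(target_word: str, level: str = "B1") -> str:
    """Compose an LLM prompt asking for a personalized cloze MCQ.

    A single target word and a proficiency level are the only inputs,
    mirroring the single-word input described in the abstract.
    """
    return (
        f"Write one cloze (fill-in-the-blank) sentence whose answer is "
        f"'{target_word}', suitable for a {level}-level English learner. "
        f"Then list three distractors of similar difficulty that are "
        f"plausible but incorrect in the blank. "
        f"Format:\nStem: ...\nAnswer: {target_word}\nDistractors: ..., ..., ..."
    )

# Example: the returned prompt embeds both the target word and the level,
# so the same word can yield easier or harder items for different learners.
prompt = build_cloze_prompt("abundant", level="B2")
assert "abundant" in prompt and "B2" in prompt
```

The prompt would then be sent to an LLM of choice; keeping difficulty as an explicit parameter is what makes the resulting test adaptable to each learner.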