Abstract
Providing explanations for cloze questions in language assessment (LA) has been recognized as a valuable approach to enhancing learners' language proficiency. However, there is a noticeable absence of dedicated tasks and datasets specifically designed for generating explanations for language learners. In response to this gap, this paper introduces ClozEx, a novel task of generating explanations for cloze questions in LA, with a particular focus on English as a Second Language (ESL) learners. To support this task, we present a meticulously curated dataset comprising cloze questions paired with corresponding explanations. This dataset aims to assess language proficiency and to facilitate language learning by offering informative and accurate explanations. To tackle the task, we fine-tuned various baseline models on our training data, including encoder-decoder and decoder-only architectures. We also explored whether large language models (LLMs) can generate good explanations without fine-tuning, using only pre-defined prompts. The evaluation results demonstrate that encoder-decoder models have the potential to deliver fluent and valid explanations when trained on our dataset.
- Anthology ID:
- 2023.findings-emnlp.347
- Volume:
- Findings of the Association for Computational Linguistics: EMNLP 2023
- Month:
- December
- Year:
- 2023
- Address:
- Singapore
- Editors:
- Houda Bouamor, Juan Pino, Kalika Bali
- Venue:
- Findings
- Publisher:
- Association for Computational Linguistics
- Pages:
- 5228–5242
- URL:
- https://aclanthology.org/2023.findings-emnlp.347
- DOI:
- 10.18653/v1/2023.findings-emnlp.347
- Cite (ACL):
- Zizheng Zhang, Masato Mita, and Mamoru Komachi. 2023. ClozEx: A Task toward Generation of English Cloze Explanation. In Findings of the Association for Computational Linguistics: EMNLP 2023, pages 5228–5242, Singapore. Association for Computational Linguistics.
- Cite (Informal):
- ClozEx: A Task toward Generation of English Cloze Explanation (Zhang et al., Findings 2023)
- PDF:
- https://preview.aclanthology.org/nschneid-patch-5/2023.findings-emnlp.347.pdf