Distractor Generation based on Text2Text Language Models with Pseudo Kullback-Leibler Divergence Regulation
Hui-Juan Wang | Kai-Yu Hsieh | Han-Cheng Yu | Jui-Ching Tsou | Yu An Shih | Chen-Hua Huang | Yao-Chung Fan
Findings of the Association for Computational Linguistics: ACL 2023
In this paper, we address the task of cloze-style multiple-choice question (MCQ) distractor generation. Our study features the following designs. First, we formulate cloze distractor generation as a Text2Text task. Second, we propose a pseudo Kullback-Leibler divergence for regulating the generation to account for the item discrimination index used in educational evaluation. Third, we explore a candidate augmentation strategy and multi-task training with cloze-related tasks to further boost generation performance. Through experiments on benchmark datasets, our best performing model advances the state-of-the-art result from 10.81 to 22.00 (P@1 score).
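The abstract names the pseudo-KL regulator without giving its formulation. Below is a minimal, hypothetical Python sketch of the general idea: scoring a distractor's likelihood against the answer's under a Text2Text model and penalizing the gap so that distractors remain plausible yet distinguishable. The model choice (t5-small), the clamp-based gap term, and the 0.5 weight are illustrative assumptions, not the paper's actual method.

```python
# Illustrative sketch only: the abstract does not specify the exact
# pseudo-KL formulation, so the regulator below is a hypothetical stand-in.
import torch
import torch.nn.functional as F
from transformers import T5ForConditionalGeneration, T5Tokenizer

tokenizer = T5Tokenizer.from_pretrained("t5-small")
model = T5ForConditionalGeneration.from_pretrained("t5-small")

def target_log_probs(context: str, target: str) -> torch.Tensor:
    """Per-token log-probabilities of `target` given the cloze `context`."""
    inputs = tokenizer(context, return_tensors="pt")
    labels = tokenizer(target, return_tensors="pt").input_ids
    logits = model(**inputs, labels=labels).logits          # (1, T, vocab)
    log_probs = F.log_softmax(logits, dim=-1)
    # Pick out the log-probability assigned to each target token.
    return log_probs.gather(-1, labels.unsqueeze(-1)).squeeze(-1)  # (1, T)

# Cloze stem in T5's span-infilling format; the blank is <extra_id_0>.
stem = "The capital of France is <extra_id_0>."
answer_lp = target_log_probs(stem, "<extra_id_0> Paris")
distractor_lp = target_log_probs(stem, "<extra_id_0> Lyon")

# Hypothetical "pseudo-KL" regulator: penalize a large likelihood gap so the
# distractor stays plausible, while never rewarding it above the answer.
pseudo_kl = (answer_lp.mean() - distractor_lp.mean()).clamp(min=0.0)
loss = -answer_lp.mean() + 0.5 * pseudo_kl   # 0.5 is an assumed weight
```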