Predicting Difficulty and Discrimination of Natural Language Questions

Matthew Byrd, Shashank Srivastava


Abstract
Item Response Theory (IRT) has been extensively used to numerically characterize question difficulty and discrimination for human subjects in domains including cognitive psychology and education (Primi et al., 2014; Downing, 2003). More recently, IRT has been used to similarly characterize item difficulty and discrimination for natural language models across various datasets (Lalor et al., 2019; Vania et al., 2021; Rodriguez et al., 2021). In this work, we explore predictive models for directly estimating and explaining these traits for natural language questions in a question-answering context. We use HotpotQA for illustration. Our experiments show that it is possible to predict both difficulty and discrimination parameters for new questions, and these traits are correlated with features of questions, answers, and associated contexts. Our findings can have significant implications for the creation of new datasets and tests on the one hand and strategies such as active learning and curriculum learning on the other.
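For reference, the difficulty and discrimination traits discussed in the abstract come from standard IRT item response functions. The abstract does not state which IRT variant the paper fits, so the two-parameter logistic (2PL) form below is given only as the conventional formulation in which both parameters appear, not as the paper's exact model:

P(y_{ij} = 1 \mid \theta_j) = \frac{1}{1 + \exp\!\bigl(-a_i(\theta_j - b_i)\bigr)}

where \theta_j is the ability of subject (or model) j, b_i is the difficulty of item i, and a_i is its discrimination, i.e., the slope of the response curve at \theta_j = b_i.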
Anthology ID:
2022.acl-short.15
Volume:
Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers)
Month:
May
Year:
2022
Address:
Dublin, Ireland
Editors:
Smaranda Muresan, Preslav Nakov, Aline Villavicencio
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
119–130
URL:
https://aclanthology.org/2022.acl-short.15
DOI:
10.18653/v1/2022.acl-short.15
Cite (ACL):
Matthew Byrd and Shashank Srivastava. 2022. Predicting Difficulty and Discrimination of Natural Language Questions. In Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers), pages 119–130, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
Predicting Difficulty and Discrimination of Natural Language Questions (Byrd & Srivastava, ACL 2022)
PDF:
https://preview.aclanthology.org/add_acl24_videos/2022.acl-short.15.pdf
Software:
2022.acl-short.15.software.zip
Video:
https://preview.aclanthology.org/add_acl24_videos/2022.acl-short.15.mp4
Data:
HotpotQA