Entropy as a Proxy for Gap Complexity in Open Cloze Tests

Mariano Felice, Paula Buttery


Abstract
This paper presents a pilot study of entropy as a measure of gap complexity in open cloze tests aimed at learners of English. Entropy is used to quantify the information content of each gap, from which its complexity can be estimated. Our study shows that average gap entropy correlates positively with proficiency levels, while individual gap entropy can capture contextual complexity. To the best of our knowledge, this is the first unsupervised information-theoretic approach to evaluating the quality of cloze tests.
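To illustrate the idea in the abstract: the Shannon entropy of the distribution of plausible fillers for a gap can serve as a complexity score, with higher entropy meaning more competing answers and hence a harder gap. The sketch below is not the authors' implementation; the function name and the hand-set candidate distributions are hypothetical (in practice such probabilities might come from a language model), and it assumes entropy in bits.

    import math

    def gap_entropy(filler_probs):
        """Shannon entropy (in bits) of a distribution over candidate gap fillers.

        Hypothetical helper, not from the paper: higher entropy indicates
        more plausible alternatives, i.e. a more complex gap.
        """
        return -sum(p * math.log2(p) for p in filler_probs.values() if p > 0)

    # Hypothetical filler distributions for two gaps (illustrative only).
    # Near-deterministic gap, e.g. "She depends ___ her parents":
    easy_gap = {"on": 0.95, "upon": 0.04, "with": 0.01}
    # Gap with many plausible fillers, e.g. "He walked ___ the room":
    hard_gap = {"into": 0.30, "across": 0.25, "around": 0.20,
                "through": 0.15, "out": 0.10}

    print(f"easy gap: {gap_entropy(easy_gap):.2f} bits")  # ~0.32
    print(f"hard gap: {gap_entropy(hard_gap):.2f} bits")  # ~2.23

Averaging such per-gap scores over a whole test would yield the average gap entropy that the paper reports as correlating with proficiency level.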
Anthology ID: R19-1037
Volume: Proceedings of the International Conference on Recent Advances in Natural Language Processing (RANLP 2019)
Month: September
Year: 2019
Address: Varna, Bulgaria
Editors: Ruslan Mitkov, Galia Angelova
Venue: RANLP
Publisher: INCOMA Ltd.
Pages: 323–327
URL: https://aclanthology.org/R19-1037
DOI: 10.26615/978-954-452-056-4_037
Cite (ACL): Mariano Felice and Paula Buttery. 2019. Entropy as a Proxy for Gap Complexity in Open Cloze Tests. In Proceedings of the International Conference on Recent Advances in Natural Language Processing (RANLP 2019), pages 323–327, Varna, Bulgaria. INCOMA Ltd.
Cite (Informal): Entropy as a Proxy for Gap Complexity in Open Cloze Tests (Felice & Buttery, RANLP 2019)
PDF: https://preview.aclanthology.org/nschneid-patch-5/R19-1037.pdf