A Lightweight Method to Generate Unanswerable Questions in English

Vagrant Gautam, Miaoran Zhang, Dietrich Klakow


Abstract
If a question cannot be answered with the available information, robust systems for question answering (QA) should know *not* to answer. One way to build QA models that do this is with additional training data comprising unanswerable questions, created either by employing annotators or through automated methods for unanswerable question generation. To show that the model complexity of existing automated approaches is not justified, we examine a simpler data augmentation method for unanswerable question generation in English: performing antonym and entity swaps on answerable questions. Compared to the prior state-of-the-art, data generated with our training-free and lightweight strategy results in better models (+1.6 F1 points on SQuAD 2.0 data with BERT-large), and has higher human-judged relatedness and readability. We quantify the raw benefits of our approach compared to no augmentation across multiple encoder models, using different amounts of generated data, and also on TydiQA-MinSpan data (+9.3 F1 points with BERT-large). Our results establish swaps as a simple but strong baseline for future work.
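To give a concrete picture of the swap-based augmentation the abstract describes, here is a minimal sketch in Python. It is not the paper's implementation: the use of WordNet (via NLTK) for antonym lookup, spaCy's en_core_web_sm model for named entity recognition, and the antonym_swap / entity_swap / entity_pool names are all assumptions made purely for illustration.

    import random

    import spacy
    from nltk.corpus import wordnet as wn

    # Assumed dependencies: the spaCy model must be installed and NLTK's
    # WordNet corpus downloaded, e.g. nltk.download("wordnet").
    nlp = spacy.load("en_core_web_sm")


    def antonym_swap(question: str) -> str | None:
        """Replace the first token that has a WordNet antonym with that antonym."""
        tokens = question.split()
        for i, tok in enumerate(tokens):
            for syn in wn.synsets(tok.lower()):
                for lemma in syn.lemmas():
                    if lemma.antonyms():
                        tokens[i] = lemma.antonyms()[0].name().replace("_", " ")
                        return " ".join(tokens)
        return None  # no swappable token found; keep the original question


    def entity_swap(question: str, entity_pool: dict[str, list[str]]) -> str | None:
        """Replace one named entity with a different entity of the same type."""
        doc = nlp(question)
        for ent in doc.ents:
            candidates = [e for e in entity_pool.get(ent.label_, []) if e != ent.text]
            if candidates:
                return question.replace(ent.text, random.choice(candidates), 1)
        return None  # no named entity recognised; no swap possible


    # Example usage with a hypothetical entity pool: either swap produces a
    # question that is no longer answerable from the original passage.
    pool = {"GPE": ["France", "Japan", "Brazil"], "PERSON": ["Marie Curie"]}
    print(antonym_swap("What is the largest city in Australia?"))
    print(entity_swap("What is the largest city in Australia?", pool))

The design intuition behind such swaps, as the abstract notes, is that the perturbed question stays closely related to the passage and readable, while the answer it asks for is no longer present.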
Anthology ID: 2023.findings-emnlp.491
Volume: Findings of the Association for Computational Linguistics: EMNLP 2023
Month: December
Year: 2023
Address: Singapore
Editors: Houda Bouamor, Juan Pino, Kalika Bali
Venue: Findings
Publisher: Association for Computational Linguistics
Pages: 7349–7360
URL: https://aclanthology.org/2023.findings-emnlp.491
DOI: 10.18653/v1/2023.findings-emnlp.491
Cite (ACL): Vagrant Gautam, Miaoran Zhang, and Dietrich Klakow. 2023. A Lightweight Method to Generate Unanswerable Questions in English. In Findings of the Association for Computational Linguistics: EMNLP 2023, pages 7349–7360, Singapore. Association for Computational Linguistics.
Cite (Informal): A Lightweight Method to Generate Unanswerable Questions in English (Gautam et al., Findings 2023)
PDF: https://preview.aclanthology.org/ingest-acl-2023-videos/2023.findings-emnlp.491.pdf