Can Question Generation Debias Question Answering Models? A Case Study on Question–Context Lexical Overlap

Kazutoshi Shinoda, Saku Sugawara, Akiko Aizawa


Abstract
Question answering (QA) models for reading comprehension have been demonstrated to exploit unintended dataset biases such as question–context lexical overlap. This hinders QA models from generalizing to under-represented samples such as questions with low lexical overlap. Question generation (QG), a method for augmenting QA datasets, can be a solution to such performance degradation if QG can properly debias QA datasets. However, we discover that recent neural QG models are biased towards generating questions with high lexical overlap, which can amplify the dataset bias. Moreover, our analysis reveals that data augmentation with these QG models frequently impairs the performance on questions with low lexical overlap, while improving that on questions with high lexical overlap. To address this problem, we use a synonym-replacement-based approach to augment questions with low lexical overlap. We demonstrate that the proposed data augmentation approach is simple yet effective in mitigating the degradation problem with only 70k synthetic examples.
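To make the two notions in the abstract concrete, the sketch below is a minimal illustration, not the authors' implementation: it measures question–context lexical overlap as the fraction of question content words that also appear in the context, and lowers that overlap by replacing overlapping question words with WordNet synonyms absent from the context. The stopword list, overlap definition, and replacement heuristic are assumptions chosen for illustration; it requires NLTK with the WordNet corpus downloaded.

# Illustrative sketch (assumption, not the paper's method):
# question-context lexical overlap and synonym-replacement augmentation.
# Requires: pip install nltk; then nltk.download('wordnet') once.
import re
from nltk.corpus import wordnet as wn

STOPWORDS = {"the", "a", "an", "of", "in", "on", "to", "is", "was", "what",
             "which", "who", "when", "where", "how", "why", "did", "does"}

def content_words(text):
    # Lowercase word tokens, with stopwords removed.
    return [w for w in re.findall(r"[a-z0-9']+", text.lower()) if w not in STOPWORDS]

def lexical_overlap(question, context):
    # Fraction of question content words that also occur in the context.
    q_words = content_words(question)
    c_words = set(content_words(context))
    if not q_words:
        return 0.0
    return sum(w in c_words for w in q_words) / len(q_words)

def replace_with_synonyms(question, context):
    # Replace question words that appear in the context with a WordNet
    # synonym that does not, reducing question-context lexical overlap.
    c_words = set(content_words(context))
    out = []
    for word in question.split():
        key = word.lower().strip(".,?!")
        if key in c_words and key not in STOPWORDS:
            synonyms = {l.name().replace("_", " ")
                        for s in wn.synsets(key) for l in s.lemmas()}
            candidates = [s for s in synonyms
                          if s.lower() != key and s.lower() not in c_words]
            if candidates:
                out.append(sorted(candidates)[0])  # deterministic choice
                continue
        out.append(word)
    return " ".join(out)

if __name__ == "__main__":
    context = "The Eiffel Tower was completed in 1889 for the World's Fair."
    question = "When was the Eiffel Tower completed?"
    print(lexical_overlap(question, context))          # high overlap
    augmented = replace_with_synonyms(question, context)
    print(augmented, lexical_overlap(augmented, context))  # lower overlap

Note that naive synonym replacement can produce ungrammatical questions; the snippet only illustrates how overlap can be measured and pushed lower, not how synthetic questions should be filtered for quality.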
Anthology ID:
2021.mrqa-1.6
Volume:
Proceedings of the 3rd Workshop on Machine Reading for Question Answering
Month:
November
Year:
2021
Address:
Punta Cana, Dominican Republic
Venue:
MRQA
Publisher:
Association for Computational Linguistics
Pages:
63–72
URL:
https://aclanthology.org/2021.mrqa-1.6
DOI:
10.18653/v1/2021.mrqa-1.6
Cite (ACL):
Kazutoshi Shinoda, Saku Sugawara, and Akiko Aizawa. 2021. Can Question Generation Debias Question Answering Models? A Case Study on Question–Context Lexical Overlap. In Proceedings of the 3rd Workshop on Machine Reading for Question Answering, pages 63–72, Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
Can Question Generation Debias Question Answering Models? A Case Study on Question–Context Lexical Overlap (Shinoda et al., MRQA 2021)
PDF:
https://preview.aclanthology.org/ingestion-script-update/2021.mrqa-1.6.pdf
Data
SQuAD