Abstract
The goal of question answering (QA) is to answer _any_ question. However, major QA datasets have skewed distributions over gender, profession, and nationality. Despite that skew, an analysis of model accuracy reveals little evidence that accuracy is lower for people based on gender or nationality; instead, there is more variation on professions (question topic) and question ambiguity. But QA’s lack of representation could itself hide evidence of bias, necessitating QA datasets that better represent global diversity.
- Anthology ID:
- 2021.emnlp-main.444
- Volume:
- Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
- Month:
- November
- Year:
- 2021
- Address:
- Online and Punta Cana, Dominican Republic
- Editors:
- Marie-Francine Moens, Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih
- Venue:
- EMNLP
- Publisher:
- Association for Computational Linguistics
- Pages:
- 5457–5473
- URL:
- https://aclanthology.org/2021.emnlp-main.444
- DOI:
- 10.18653/v1/2021.emnlp-main.444
- Cite (ACL):
- Maharshi Gor, Kellie Webster, and Jordan Boyd-Graber. 2021. Toward Deconfounding the Effect of Entity Demographics for Question Answering Accuracy. In Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, pages 5457–5473, Online and Punta Cana, Dominican Republic. Association for Computational Linguistics.
- Cite (Informal):
- Toward Deconfounding the Effect of Entity Demographics for Question Answering Accuracy (Gor et al., EMNLP 2021)
- PDF:
- https://preview.aclanthology.org/dois-2013-emnlp/2021.emnlp-main.444.pdf
- Data
- Natural Questions, SQuAD, TriviaQA