These are not the Stereotypes You are Looking For: Bias and Fairness in Authorial Gender Attribution

Corina Koolen, Andreas van Cranenburgh


Abstract
Stylometric and text categorization results show that author gender can be discerned in texts with relatively high accuracy. However, it is difficult to explain what gives rise to these results, and there are many possible confounding factors, such as the domain, genre, and target audience of a text. More fundamentally, such classification efforts risk invoking stereotyping and essentialism. We explore this issue in two datasets of Dutch literary novels, using commonly used descriptive (LIWC, topic modeling) and predictive (machine learning) methods. Our results show the importance of controlling for variables in the corpus, and we argue for taking care not to overgeneralize from the results.
Anthology ID:
W17-1602
Volume:
Proceedings of the First ACL Workshop on Ethics in Natural Language Processing
Month:
April
Year:
2017
Address:
Valencia, Spain
Editors:
Dirk Hovy, Shannon Spruit, Margaret Mitchell, Emily M. Bender, Michael Strube, Hanna Wallach
Venue:
EthNLP
Publisher:
Association for Computational Linguistics
Pages:
12–22
URL:
https://aclanthology.org/W17-1602
DOI:
10.18653/v1/W17-1602
Cite (ACL):
Corina Koolen and Andreas van Cranenburgh. 2017. These are not the Stereotypes You are Looking For: Bias and Fairness in Authorial Gender Attribution. In Proceedings of the First ACL Workshop on Ethics in Natural Language Processing, pages 12–22, Valencia, Spain. Association for Computational Linguistics.
Cite (Informal):
These are not the Stereotypes You are Looking For: Bias and Fairness in Authorial Gender Attribution (Koolen & van Cranenburgh, EthNLP 2017)
PDF:
https://preview.aclanthology.org/ml4al-ingestion/W17-1602.pdf