Abstract
Measuring and mitigating gender bias in natural language processing (NLP) systems is crucial to ensure fair and ethical AI. However, a key challenge is the lack of explicit gender information in many textual datasets. This paper proposes two techniques, Identity Term Sampling (ITS) and Identity Term Pattern Extraction (ITPE), as alternatives to template-based approaches for measuring gender bias in text data. These approaches draw the test data for measuring gender bias from the dataset itself and can be used to measure the gender bias of any NLP classifier. We demonstrate the use of these approaches for measuring gender bias across various NLP classification tasks, including hate speech detection, fake news identification, and sentiment analysis. Additionally, we show how these techniques can benefit gender bias mitigation, proposing a variant of Counterfactual Data Augmentation (CDA), called Gender-Selective CDA (GS-CDA), which reduces the amount of data augmentation required in training data while effectively mitigating gender bias and maintaining overall classification performance.
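For readers unfamiliar with CDA, the core operation the abstract builds on can be sketched as follows. This is a minimal illustration, not the authors' implementation: the term-pair list is an assumed toy example, pronoun handling is naive (e.g. "her" is always mapped to "his", though as an object pronoun it should become "him"), and GS-CDA's criterion for selecting which instances to augment is not reproduced here, so `cda_augment` shows only the standard augment-everything variant.

```python
import re

# Toy, non-exhaustive gendered term pairs. This list is an illustrative
# assumption for the sketch; the paper's identity-term lists may differ.
GENDER_PAIRS = {
    "he": "she", "she": "he",
    "him": "her",
    "his": "her", "her": "his",  # naive: object-pronoun "her" should become "him"
    "man": "woman", "woman": "man",
    "men": "women", "women": "men",
    "male": "female", "female": "male",
}

# Whole-word, case-insensitive match against any term in the pair list.
_PATTERN = re.compile(r"\b(" + "|".join(GENDER_PAIRS) + r")\b", re.IGNORECASE)

def gender_swap(text: str) -> str:
    """Return a counterfactual copy of `text` with gendered terms swapped,
    preserving the capitalisation of each replaced word."""
    def repl(match: re.Match) -> str:
        word = match.group(0)
        swapped = GENDER_PAIRS[word.lower()]
        return swapped.capitalize() if word[0].isupper() else swapped
    return _PATTERN.sub(repl, text)

def cda_augment(examples: list[str]) -> list[str]:
    """Standard (non-selective) CDA: append a gender-swapped copy of every
    example that contains at least one gendered term."""
    return examples + [gender_swap(t) for t in examples if _PATTERN.search(t)]

print(gender_swap("He praised his mentor and she thanked him."))
# -> "She praised her mentor and he thanked her."
```

Under this reading, GS-CDA would apply the same swap but only to a selected subset of training instances, which is what lets it reduce the augmentation volume the abstract mentions.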
- Anthology ID: 2024.gebnlp-1.10
- Volume: Proceedings of the 5th Workshop on Gender Bias in Natural Language Processing (GeBNLP)
- Month: August
- Year: 2024
- Address: Bangkok, Thailand
- Editors: Agnieszka Faleńska, Christine Basta, Marta Costa-jussà, Seraphina Goldfarb-Tarrant, Debora Nozza
- Venues: GeBNLP | WS
- Publisher: Association for Computational Linguistics
- Pages: 167–178
- URL: https://aclanthology.org/2024.gebnlp-1.10
- DOI: 10.18653/v1/2024.gebnlp-1.10
- Cite (ACL): Nasim Sobhani and Sarah Delany. 2024. Towards Fairer NLP Models: Handling Gender Bias In Classification Tasks. In Proceedings of the 5th Workshop on Gender Bias in Natural Language Processing (GeBNLP), pages 167–178, Bangkok, Thailand. Association for Computational Linguistics.
- Cite (Informal): Towards Fairer NLP Models: Handling Gender Bias In Classification Tasks (Sobhani & Delany, GeBNLP-WS 2024)
- PDF: https://preview.aclanthology.org/dois-2013-emnlp/2024.gebnlp-1.10.pdf