Examining the Cultural Encoding of Gender Bias in LLMs for Low-Resourced African Languages

Abigail Oppong, Hellina Hailu Nigatu, Chinasa T. Okolo


Abstract
Large Language Models (LLMs) are deployed in many aspects of everyday life. While the technology offers potential benefits, like many socio-technical systems it also encodes biases. Trained on large datasets crawled from the web, these models perpetuate stereotypes and reproduce the representational bias rampant in their training data. Languages encode gender in varying ways: some languages are grammatically gendered, while others are not. Bias in the languages themselves may also vary based on cultural, social, and religious contexts. In this paper, we investigate gender bias in LLMs through two languages, Twi and Amharic. Twi is a non-gendered African language spoken in Ghana, while Amharic is a gendered language spoken in Ethiopia. Using these two languages from opposite ends of the continent, with opposing grammatical gender systems, we evaluate LLMs on three tasks: Machine Translation, Image Generation, and Sentence Completion. Our results offer insights into the gender bias encoded in LLMs for two low-resourced languages and broaden the conversation on how culture and social structures contribute to disparate system performance.
Anthology ID:
2025.gebnlp-1.31
Volume:
Proceedings of the 6th Workshop on Gender Bias in Natural Language Processing (GeBNLP)
Month:
August
Year:
2025
Address:
Vienna, Austria
Editors:
Agnieszka Faleńska, Christine Basta, Marta Costa-jussà, Karolina Stańczak, Debora Nozza
Venues:
GeBNLP | WS
Publisher:
Association for Computational Linguistics
Pages:
358–378
URL:
https://preview.aclanthology.org/landing_page/2025.gebnlp-1.31/
DOI:
10.18653/v1/2025.gebnlp-1.31
Cite (ACL):
Abigail Oppong, Hellina Hailu Nigatu, and Chinasa T. Okolo. 2025. Examining the Cultural Encoding of Gender Bias in LLMs for Low-Resourced African Languages. In Proceedings of the 6th Workshop on Gender Bias in Natural Language Processing (GeBNLP), pages 358–378, Vienna, Austria. Association for Computational Linguistics.
Cite (Informal):
Examining the Cultural Encoding of Gender Bias in LLMs for Low-Resourced African Languages (Oppong et al., GeBNLP 2025)
PDF:
https://preview.aclanthology.org/landing_page/2025.gebnlp-1.31.pdf