Deciphering Stereotypes in Pre-Trained Language Models

Weicheng Ma, Henry Scheible, Brian Wang, Goutham Veeramachaneni, Pratim Chowdhary, Alan Sun, Andrew Koulogeorge, Lili Wang, Diyi Yang, Soroush Vosoughi


Abstract
Warning: This paper contains content that is stereotypical and may be upsetting. This paper addresses the issue of demographic stereotypes present in Transformer-based pre-trained language models (PLMs) and aims to deepen our understanding of how these biases are encoded in these models. To accomplish this, we introduce an easy-to-use framework for examining the stereotype-encoding behavior of PLMs through a combination of model probing and textual analyses. Our findings reveal that a small subset of attention heads within PLMs are primarily responsible for encoding stereotypes and that stereotypes toward specific minority groups can be identified using attention maps on these attention heads. Leveraging these insights, we propose an attention-head pruning method as a viable approach for debiasing PLMs, without compromising their language modeling capabilities or adversely affecting their performance on downstream tasks.
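The abstract describes attention-head pruning as the proposed debiasing approach. Below is a minimal, illustrative sketch (not the authors' code) of how such pruning could be applied to a BERT-style PLM with the Hugging Face Transformers `prune_heads` API; the layer and head indices are hypothetical placeholders standing in for heads that the paper's probing procedure would identify as stereotype-encoding.

```python
# Minimal sketch: pruning selected attention heads from a pre-trained LM.
# Assumes Hugging Face Transformers; head indices below are hypothetical.
from transformers import AutoModelForMaskedLM

model = AutoModelForMaskedLM.from_pretrained("bert-base-uncased")

# Hypothetical example: suppose probing flagged heads 2 and 7 in layer 3
# and head 11 in layer 9 as strongly encoding stereotypes.
heads_to_prune = {3: [2, 7], 9: [11]}

# prune_heads removes the parameters of the selected heads from the
# attention layers; the rest of the model is left unchanged.
model.prune_heads(heads_to_prune)

# The pruned model can then be re-evaluated on bias benchmarks and on
# downstream tasks to verify that language-modeling ability is preserved.
```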
Anthology ID:
2023.emnlp-main.697
Volume:
Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
11328–11345
URL:
https://aclanthology.org/2023.emnlp-main.697
DOI:
10.18653/v1/2023.emnlp-main.697
Cite (ACL):
Weicheng Ma, Henry Scheible, Brian Wang, Goutham Veeramachaneni, Pratim Chowdhary, Alan Sun, Andrew Koulogeorge, Lili Wang, Diyi Yang, and Soroush Vosoughi. 2023. Deciphering Stereotypes in Pre-Trained Language Models. In Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing, pages 11328–11345, Singapore. Association for Computational Linguistics.
Cite (Informal):
Deciphering Stereotypes in Pre-Trained Language Models (Ma et al., EMNLP 2023)
PDF:
https://preview.aclanthology.org/dois-2013-emnlp/2023.emnlp-main.697.pdf
Video:
https://preview.aclanthology.org/dois-2013-emnlp/2023.emnlp-main.697.mp4