Stereotypes and Smut: The (Mis)representation of Non-cisgender Identities by Text-to-Image Models

Eddie Ungless, Björn Ross, Anne Lauscher


Abstract
Cutting-edge image generation has been praised for producing high-quality images, suggesting a ubiquitous future in a variety of applications. However, initial studies have pointed to the potential for harm due to predictive bias, reflecting and potentially reinforcing cultural stereotypes. In this work, we are the first to investigate how multimodal models handle diverse gender identities. Concretely, we conduct a thorough analysis in which we compare the output of three image generation models for prompts containing cisgender vs. non-cisgender identity terms. Our findings demonstrate that certain non-cisgender identities are consistently (mis)represented as less human, more stereotyped and more sexualised. We complement our experimental analysis with (a) a survey among non-cisgender individuals and (b) a series of interviews, to establish which harms affected individuals anticipate, and how they would like to be represented. We find respondents are particularly concerned about misrepresentation, and the potential to drive harmful behaviours and beliefs. Simple heuristics to limit offensive content are widely rejected, and instead respondents call for community involvement, curated training data and the ability to customise. These improvements could pave the way for a future where change is led by the affected community, and technology is used to positively "[portray] queerness in ways that we haven't even thought of" rather than reproducing stale, offensive stereotypes.
Anthology ID:
2023.findings-acl.502
Volume:
Findings of the Association for Computational Linguistics: ACL 2023
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
7919–7942
URL:
https://aclanthology.org/2023.findings-acl.502
DOI:
10.18653/v1/2023.findings-acl.502
Cite (ACL):
Eddie Ungless, Björn Ross, and Anne Lauscher. 2023. Stereotypes and Smut: The (Mis)representation of Non-cisgender Identities by Text-to-Image Models. In Findings of the Association for Computational Linguistics: ACL 2023, pages 7919–7942, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Stereotypes and Smut: The (Mis)representation of Non-cisgender Identities by Text-to-Image Models (Ungless et al., Findings 2023)
PDF:
https://preview.aclanthology.org/naacl24-info/2023.findings-acl.502.pdf
Video:
https://preview.aclanthology.org/naacl24-info/2023.findings-acl.502.mp4