Abstract
There is a growing collection of work analyzing and mitigating societal biases in language understanding, generation, and retrieval tasks, though examining biases in creative tasks remains underexplored. Creative language applications are meant for direct interaction with users, so it is important to quantify and mitigate societal biases in these applications. We introduce a novel study on a pipeline to mitigate societal biases when retrieving next verse suggestions in a poetry composition system. Our results suggest that data augmentation through sentiment style transfer has potential for mitigating societal biases.
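As a rough illustration only, the sketch below shows one way data augmentation via sentiment style transfer could feed a verse-retrieval corpus; it is not the authors' pipeline. The function names (`augment_corpus`, `transfer_sentiment`, `demo_transfer`) and the corpus format are hypothetical placeholders assumed for this example.

```python
# Illustrative sketch: augment a verse-retrieval corpus with
# sentiment-style-transferred variants. `transfer_sentiment` is a
# hypothetical stand-in for any sentiment style transfer model;
# it is NOT part of the paper's released code or data.

from typing import Callable, List, Sequence


def augment_corpus(
    verses: List[str],
    transfer_sentiment: Callable[[str, str], str],
    target_sentiments: Sequence[str] = ("positive", "negative"),
) -> List[str]:
    """Return the original verses plus sentiment-style-transferred variants."""
    augmented = list(verses)
    for verse in verses:
        for sentiment in target_sentiments:
            variant = transfer_sentiment(verse, sentiment)
            # Keep only genuinely new lines so the corpus is not padded with duplicates.
            if variant and variant != verse:
                augmented.append(variant)
    return augmented


if __name__ == "__main__":
    # Trivial stand-in for a real sentiment style transfer model.
    demo_transfer = lambda verse, sentiment: f"{verse} ({sentiment} variant)"
    corpus = ["the quiet river bends toward night"]
    for line in augment_corpus(corpus, demo_transfer):
        print(line)
```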
- Anthology ID:
- 2020.gebnlp-1.9
- Volume:
- Proceedings of the Second Workshop on Gender Bias in Natural Language Processing
- Month:
- December
- Year:
- 2020
- Address:
- Barcelona, Spain (Online)
- Editors:
- Marta R. Costa-jussà, Christian Hardmeier, Will Radford, Kellie Webster
- Venue:
- GeBNLP
- Publisher:
- Association for Computational Linguistics
- Pages:
- 93–106
- URL:
- https://preview.aclanthology.org/icon-24-ingestion/2020.gebnlp-1.9/
- Cite (ACL):
- Emily Sheng and David Uthus. 2020. Investigating Societal Biases in a Poetry Composition System. In Proceedings of the Second Workshop on Gender Bias in Natural Language Processing, pages 93–106, Barcelona, Spain (Online). Association for Computational Linguistics.
- Cite (Informal):
- Investigating Societal Biases in a Poetry Composition System (Sheng & Uthus, GeBNLP 2020)
- PDF:
- https://preview.aclanthology.org/icon-24-ingestion/2020.gebnlp-1.9.pdf
- Code
- google-research-datasets/poem-sentiment
- Data
- Gutenberg Poem Dataset