Abstract
In this paper we show that GEC systems display gender bias related to the use of masculine and feminine terms and the gender-neutral singular “they”. We develop parallel datasets of texts with masculine and feminine terms and singular “they”, and use them to quantify gender bias in three competitive GEC systems. We contribute a novel data augmentation technique for singular “they” leveraging linguistic insights about its distribution relative to plural “they”. We demonstrate that both this data augmentation technique and a refinement of a similar augmentation technique for masculine and feminine terms can generate training data that reduces bias in GEC systems, especially with respect to singular “they”, while maintaining the same level of quality.

- Anthology ID: 2023.bea-1.13
- Volume: Proceedings of the 18th Workshop on Innovative Use of NLP for Building Educational Applications (BEA 2023)
- Month: July
- Year: 2023
- Address: Toronto, Canada
- Editors: Ekaterina Kochmar, Jill Burstein, Andrea Horbach, Ronja Laarmann-Quante, Nitin Madnani, Anaïs Tack, Victoria Yaneva, Zheng Yuan, Torsten Zesch
- Venue: BEA
- SIG: SIGEDU
- Publisher: Association for Computational Linguistics
- Pages: 148–162
- URL: https://aclanthology.org/2023.bea-1.13
- DOI: 10.18653/v1/2023.bea-1.13
- Cite (ACL): Gunnar Lund, Kostiantyn Omelianchuk, and Igor Samokhin. 2023. Gender-Inclusive Grammatical Error Correction through Augmentation. In Proceedings of the 18th Workshop on Innovative Use of NLP for Building Educational Applications (BEA 2023), pages 148–162, Toronto, Canada. Association for Computational Linguistics.
- Cite (Informal): Gender-Inclusive Grammatical Error Correction through Augmentation (Lund et al., BEA 2023)
- PDF: https://preview.aclanthology.org/nschneid-patch-4/2023.bea-1.13.pdf