Abstract
Gender bias appears in many neural machine translation (NMT) models and commercial translation software. Research has become increasingly aware of this problem in recent years, and there has been work on mitigating gender bias. However, the challenge of addressing gender bias in NMT persists. This work utilizes a controlled text generation method, Future Discriminators for Generation (FUDGE), to reduce the so-called Speaking As gender bias. This bias emerges when translating from English into a language that openly marks the gender of the speaker. We evaluate the model on MuST-SHE, a challenge set designed specifically to evaluate gender translation. The results demonstrate improvements in the translation accuracy of feminine terms.
- Anthology ID:
- 2023.gitt-1.6
- Volume:
- Proceedings of the First Workshop on Gender-Inclusive Translation Technologies
- Month:
- June
- Year:
- 2023
- Address:
- Tampere, Finland
- Editors:
- Eva Vanmassenhove, Beatrice Savoldi, Luisa Bentivogli, Joke Daems, Janiça Hackenbuchner
- Venue:
- GITT
- Publisher:
- European Association for Machine Translation
- Pages:
- 61–69
- URL:
- https://aclanthology.org/2023.gitt-1.6
- Cite (ACL):
- Tianshuai Lu, Noëmi Aepli, and Annette Rios. 2023. Reducing Gender Bias in NMT with FUDGE. In Proceedings of the First Workshop on Gender-Inclusive Translation Technologies, pages 61–69, Tampere, Finland. European Association for Machine Translation.
- Cite (Informal):
- Reducing Gender Bias in NMT with FUDGE (Lu et al., GITT 2023)
- PDF:
- https://aclanthology.org/2023.gitt-1.6.pdf
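
The FUDGE approach described in the abstract can be sketched as follows. At each decoding step, the base model's next-token distribution is reweighted by a discriminator that predicts, for each candidate token, whether the completed output will carry the desired attribute (here, feminine speaker marking). This is a minimal toy sketch, not the paper's implementation: the base model, the vocabulary, and the discriminator probabilities are all hypothetical stand-ins.

```python
import math

def base_next_token_logprobs(prefix):
    # Hypothetical base NMT model: a uniform distribution over a tiny
    # Spanish vocabulary (stand-in for the real decoder's softmax).
    vocab = ["cansada", "cansado", "feliz"]
    return {tok: math.log(1.0 / len(vocab)) for tok in vocab}

def attribute_logprob(prefix, token):
    # Hypothetical discriminator P(feminine | prefix + token): here it
    # simply favors the feminine-marked form ending in "-a".
    p = 0.9 if token.endswith("a") else 0.1
    return math.log(p)

def fudge_step(prefix):
    # FUDGE combines the two models per candidate token:
    # log P(token | prefix, attr) ∝ log P(token | prefix)
    #                              + log P(attr | prefix, token)
    scores = {
        tok: lp + attribute_logprob(prefix, tok)
        for tok, lp in base_next_token_logprobs(prefix).items()
    }
    # Renormalize the combined scores into a distribution (softmax).
    m = max(scores.values())
    z = sum(math.exp(s - m) for s in scores.values())
    return {tok: math.exp(s - m) / z for tok, s in scores.items()}

probs = fudge_step(["Estoy"])
print(max(probs, key=probs.get))  # the feminine-marked form wins here
```

Under this toy setup, "cansada" receives the highest reweighted probability, illustrating how the discriminator steers decoding toward feminine terms without retraining the base model.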