Abstract
Written language carries explicit and implicit biases that can distract from meaningful signals. For example, letters of reference may describe male and female candidates differently, or their writing style may indirectly reveal demographic characteristics. At best, such biases distract from the meaningful content of the text; at worst they can lead to unfair outcomes. We investigate the challenge of re-generating input sentences to ‘neutralize’ sensitive attributes while maintaining the semantic meaning of the original text (e.g. is the candidate qualified?). We propose a gradient-based rewriting framework, Detect and Perturb to Neutralize (DEPEN), that first detects sensitive components and masks them for regeneration, then perturbs the generation model at decoding time under a neutralizing constraint that pushes the (predicted) distribution of sensitive attributes towards a uniform distribution. Our experiments in two different scenarios show that DEPEN can regenerate fluent alternatives that are neutral in the sensitive attribute while maintaining the semantics of other attributes.
- Anthology ID:
- 2021.findings-emnlp.352
- Volume:
- Findings of the Association for Computational Linguistics: EMNLP 2021
- Month:
- November
- Year:
- 2021
- Address:
- Punta Cana, Dominican Republic
- Venue:
- Findings
- SIG:
- SIGDAT
- Publisher:
- Association for Computational Linguistics
- Pages:
- 4173–4181
- URL:
- https://aclanthology.org/2021.findings-emnlp.352
- DOI:
- 10.18653/v1/2021.findings-emnlp.352
- Cite (ACL):
- Zexue He, Bodhisattwa Prasad Majumder, and Julian McAuley. 2021. Detect and Perturb: Neutral Rewriting of Biased and Sensitive Text via Gradient-based Decoding. In Findings of the Association for Computational Linguistics: EMNLP 2021, pages 4173–4181, Punta Cana, Dominican Republic. Association for Computational Linguistics.
- Cite (Informal):
- Detect and Perturb: Neutral Rewriting of Biased and Sensitive Text via Gradient-based Decoding (He et al., Findings 2021)
- PDF:
- https://preview.aclanthology.org/paclic-22-ingestion/2021.findings-emnlp.352.pdf
- Code
- zexuehe/depen
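
The neutralizing constraint described in the abstract — pushing a classifier's predicted distribution over a sensitive attribute towards uniform — can be illustrated with a small sketch. This is not the authors' implementation (see the linked repository for that); it is a minimal, hypothetical example showing the core idea: take gradient steps on a logit vector to minimize the KL divergence between its softmax and the uniform distribution, which is the direction a gradient-based decoder would be nudged in. All function names here (`softmax`, `kl_to_uniform`, `neutralize`) are illustrative, not from the paper.

```python
import math

def softmax(z):
    """Numerically stable softmax over a list of logits."""
    m = max(z)
    exps = [math.exp(x - m) for x in z]
    s = sum(exps)
    return [e / s for e in exps]

def kl_to_uniform(p):
    """KL(p || uniform) for a probability vector p; 0 iff p is uniform."""
    k = len(p)
    return sum(pi * math.log(pi * k) for pi in p)

def neutralize(z, lr=0.5, steps=100):
    """Gradient descent on KL(softmax(z) || uniform).

    Uses the analytic gradient of sum_i p_i log p_i w.r.t. the logits:
        grad_j = p_j * (log p_j - sum_i p_i log p_i),
    which drives the predicted distribution towards uniform.
    """
    z = list(z)
    for _ in range(steps):
        p = softmax(z)
        f = sum(pi * math.log(pi) for pi in p)
        grad = [pi * (math.log(pi) - f) for pi in p]
        z = [zi - lr * g for zi, g in zip(z, grad)]
    return z
```

For example, a confident attribute prediction such as logits `[3.0, 0.5, -1.0]` starts far from uniform, and after the gradient steps its softmax is pushed close to `[1/3, 1/3, 1/3]`. In DEPEN this pressure is applied to the generation model's hidden states at decoding time, alongside the usual language-modeling objective, rather than to a standalone logit vector as in this toy.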