Fair Text Classification with Wasserstein Independence

Thibaud Leteno, Antoine Gourru, Charlotte Laclau, Rémi Emonet, Christophe Gravier


Abstract
Group fairness is a central research topic in text classification, where ensuring fair treatment between sensitive groups (e.g., women vs. men) remains an open challenge. This paper presents a novel method for mitigating biases in neural text classification that is agnostic to the model architecture. Given the difficulty of distinguishing fair from unfair information in a text encoder, we take inspiration from adversarial training to induce Wasserstein independence between the representations learned to predict the target label and those learned to predict a sensitive attribute. Our approach offers two significant advantages. First, it does not require sensitive-attribute annotations for either the training or the test data, which makes it better suited to real-life scenarios than existing methods that require such annotations at train time. Second, it achieves a comparable or better fairness-accuracy trade-off than existing methods.
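
The following is a minimal PyTorch sketch of the general idea the abstract describes, not the authors' implementation: a critic is trained adversarially to estimate a Wasserstein-style dependence between target-label representations and sensitive-attribute representations, and that estimate is then used as a regularizer pushing the two toward independence. All names (Critic, z_y, z_s, lam) and details such as the critic architecture and the weight-clipping Lipschitz constraint are illustrative assumptions.

# Hypothetical sketch of Wasserstein-independence regularization.
# Names and hyperparameters are illustrative, not from the paper.
import torch
import torch.nn as nn

class Critic(nn.Module):
    """Scores pairs (z_y, z_s). Trained to separate the joint
    distribution of pairs from the product of marginals, giving a
    Wasserstein-style dependence estimate between representations."""
    def __init__(self, dim):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(2 * dim, 128), nn.ReLU(), nn.Linear(128, 1))

    def forward(self, z_y, z_s):
        return self.net(torch.cat([z_y, z_s], dim=-1))

def dependence_estimate(critic, z_y, z_s):
    # Joint samples: aligned pairs from the same examples.
    joint = critic(z_y, z_s).mean()
    # Marginal samples: shuffle z_s across the batch so pairs are
    # (approximately) drawn from the product of marginals.
    perm = torch.randperm(z_s.size(0))
    marginal = critic(z_y, z_s[perm]).mean()
    return joint - marginal  # ~0 when the representations are independent

dim, lam = 64, 1.0
critic = Critic(dim)
opt_critic = torch.optim.Adam(critic.parameters(), lr=1e-4)

# Stand-ins for the outputs of a target-label encoder and a
# sensitive-attribute encoder (in practice, two heads over text).
z_y = torch.randn(32, dim, requires_grad=True)
z_s = torch.randn(32, dim, requires_grad=True)

# (1) Critic step: ascend the dependence estimate (adversary).
loss_c = -dependence_estimate(critic, z_y, z_s)
opt_critic.zero_grad()
loss_c.backward()
opt_critic.step()
for p in critic.parameters():
    p.data.clamp_(-0.01, 0.01)  # crude WGAN-style Lipschitz constraint

# (2) Encoder step: the classification loss would be added here;
# the regularizer drives the two representations toward independence.
fair_loss = lam * dependence_estimate(critic, z_y, z_s)

In a full training loop, fair_loss would be summed with the task loss and backpropagated into the encoders while the critic's parameters are held fixed, alternating with the critic step above.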
Anthology ID:
2023.emnlp-main.978
Volume:
Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
15790–15803
URL:
https://aclanthology.org/2023.emnlp-main.978
DOI:
10.18653/v1/2023.emnlp-main.978
Cite (ACL):
Thibaud Leteno, Antoine Gourru, Charlotte Laclau, Rémi Emonet, and Christophe Gravier. 2023. Fair Text Classification with Wasserstein Independence. In Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing, pages 15790–15803, Singapore. Association for Computational Linguistics.
Cite (Informal):
Fair Text Classification with Wasserstein Independence (Leteno et al., EMNLP 2023)
PDF:
https://preview.aclanthology.org/dois-2013-emnlp/2023.emnlp-main.978.pdf
Video:
https://preview.aclanthology.org/dois-2013-emnlp/2023.emnlp-main.978.mp4