Fairness Beyond Performance: Revealing Reliability Disparities Across Groups in Legal NLP

Santosh T.y.s.s, Irtiza Chowdhury


Abstract
Fairness in NLP must extend beyond performance parity to encompass equitable reliability across groups. This study exposes a critical blind spot: models often make less reliable or overconfident predictions for marginalized groups, even when overall performance appears fair. Using the FairLex benchmark as a case study in legal NLP, we systematically evaluate both performance and reliability disparities across demographic, regional, and legal attributes spanning four jurisdictions. We show that domain-specific pre-training consistently improves both performance and reliability, especially for underrepresented groups. However, common bias mitigation methods frequently worsen reliability disparities, revealing a trade-off not captured by performance metrics alone. Our results call for a rethinking of fairness in high-stakes NLP: to ensure equitable treatment, models must not only be accurate, but also reliably self-aware across all groups.
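For readers unfamiliar with what a "reliability disparity" could look like in practice, the sketch below illustrates one common way such a quantity can be measured: a per-group expected calibration error (ECE) together with the gap between the best- and worst-calibrated groups. This is an illustrative assumption for exposition only, not the paper's evaluation code; the group labels, bin count, and choice of ECE as the reliability metric are hypothetical.

    # Illustrative sketch only: NOT the paper's evaluation code.
    # Shows one way group-wise reliability (calibration) could be quantified,
    # so "reliability disparity across groups" has a concrete reading.
    import numpy as np

    def expected_calibration_error(confidences, correct, n_bins=10):
        """Binned ECE: |accuracy - confidence| per bin, weighted by bin mass."""
        confidences = np.asarray(confidences, dtype=float)
        correct = np.asarray(correct, dtype=float)
        bins = np.linspace(0.0, 1.0, n_bins + 1)
        ece = 0.0
        for lo, hi in zip(bins[:-1], bins[1:]):
            mask = (confidences > lo) & (confidences <= hi)
            if mask.any():
                gap = abs(correct[mask].mean() - confidences[mask].mean())
                ece += mask.mean() * gap
        return ece

    def reliability_disparity(confidences, correct, groups):
        """Per-group ECE and the worst-case gap between groups."""
        per_group = {
            g: expected_calibration_error(confidences[groups == g],
                                          correct[groups == g])
            for g in np.unique(groups)
        }
        return per_group, max(per_group.values()) - min(per_group.values())

    # Hypothetical toy usage: conf = predicted-class probabilities,
    # corr = 1 if the prediction was correct, grp = a protected attribute
    # (e.g. region) for each example.
    conf = np.array([0.9, 0.8, 0.95, 0.6, 0.7, 0.85])
    corr = np.array([1, 1, 0, 1, 0, 1])
    grp = np.array(["A", "A", "A", "B", "B", "B"])
    print(reliability_disparity(conf, corr, grp))

Under this reading, a model can show near-identical accuracy across groups while one group's confidences are systematically miscalibrated, which is the kind of gap performance-parity metrics alone would miss.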
Anthology ID:
2025.acl-long.1188
Volume:
Proceedings of the 63rd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
July
Year:
2025
Address:
Vienna, Austria
Editors:
Wanxiang Che, Joyce Nabende, Ekaterina Shutova, Mohammad Taher Pilehvar
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
24376–24390
URL:
https://preview.aclanthology.org/ingestion-acl-25/2025.acl-long.1188/
Cite (ACL):
Santosh T.y.s.s and Irtiza Chowdhury. 2025. Fairness Beyond Performance: Revealing Reliability Disparities Across Groups in Legal NLP. In Proceedings of the 63rd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 24376–24390, Vienna, Austria. Association for Computational Linguistics.
Cite (Informal):
Fairness Beyond Performance: Revealing Reliability Disparities Across Groups in Legal NLP (T.y.s.s & Chowdhury, ACL 2025)
PDF:
https://preview.aclanthology.org/ingestion-acl-25/2025.acl-long.1188.pdf