Adapting Bias Evaluation to Domain Contexts using Generative Models

Tamara Quiroga, Felipe Bravo-Marquez, Valentin Barriere


Abstract
Numerous datasets have been proposed to evaluate social bias in Natural Language Processing (NLP) systems. However, assessing bias within specific application domains remains challenging, as existing approaches often lack scalability and fidelity across domains. In this work, we introduce a domain-adaptive framework that uses Large Language Model (LLM) prompting to automatically transform template-based bias datasets into domain-specific variants. We apply our method to two widely used benchmarks, the Equity Evaluation Corpus (EEC) and the Identity Phrase Templates Test Set (IPTTS), adapting them to the Twitter and Wikipedia Talk domains. Our results show that the adapted datasets yield bias estimates more closely aligned with those observed on real-world data. These findings highlight the potential of LLM-based prompting to improve the realism and contextual relevance of bias evaluation in NLP systems.
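To make the abstract's pipeline concrete, here is a minimal sketch of prompt-based template adaptation: an EEC-style template instance is rewritten in a target-domain register while its identity and emotion slots are held fixed. It assumes an OpenAI-style chat API; the prompt wording, the model name, and the adapt_to_domain helper are illustrative assumptions, not the authors' actual implementation.

```python
# Hypothetical sketch of LLM-based domain adaptation of a bias template.
# Assumptions (not from the paper): OpenAI chat API, gpt-4o-mini, this prompt.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

PROMPT = (
    "Rewrite the sentence below so it reads like a natural {domain} post.\n"
    "Keep the identity term '{identity}' and the emotion word '{emotion}' "
    "unchanged so the bias template remains intact.\n\n"
    "Sentence: {sentence}"
)

def adapt_to_domain(sentence: str, identity: str, emotion: str,
                    domain: str = "Twitter") -> str:
    """Return a domain-styled variant of a template-based bias example."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder; the paper's model is not specified here
        messages=[{
            "role": "user",
            "content": PROMPT.format(domain=domain, identity=identity,
                                     emotion=emotion, sentence=sentence),
        }],
        temperature=0.7,
    )
    return response.choices[0].message.content.strip()

# Example EEC-style instance of the template "<person> feels <emotion>."
print(adapt_to_domain("My sister feels angry.", "sister", "angry"))
```

Because the identity and emotion terms are preserved verbatim, the adapted sentences can be scored by the same counterfactual bias metrics used on the original templates.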
Anthology ID:
2025.emnlp-main.1424
Volume:
Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
28043–28054
URL:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.1424/
Cite (ACL):
Tamara Quiroga, Felipe Bravo-Marquez, and Valentin Barriere. 2025. Adapting Bias Evaluation to Domain Contexts using Generative Models. In Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing, pages 28043–28054, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
Adapting Bias Evaluation to Domain Contexts using Generative Models (Quiroga et al., EMNLP 2025)
PDF:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.1424.pdf
Checklist:
2025.emnlp-main.1424.checklist.pdf