DebateQA: Evaluating Question Answering on Debatable Knowledge

Rongwu Xu, Xuan Qi, Zehan Qi, Wei Xu, Zhijiang Guo


Abstract
The rise of large language models (LLMs) has enabled people to seek answers to inherently debatable questions from LLM chatbots, necessitating a reliable way to evaluate how well the models answer them. However, traditional QA benchmarks, which assume a single fixed answer, are inadequate for this purpose. To address this, we introduce DebateQA, a dataset of 2,941 debatable questions, each accompanied by multiple human-annotated partial answers that capture a variety of perspectives. We develop two metrics: Perspective Diversity, which evaluates the comprehensiveness of perspectives in a response, and Dispute Awareness, which assesses whether the LLM acknowledges the question’s debatable nature. Experiments demonstrate that both metrics align with human preferences and remain stable across different underlying models. Using DebateQA and these two metrics, we assess 12 prevalent LLMs and retrieval-augmented generation methods. Our findings reveal that while LLMs generally excel at recognizing debatable issues, their ability to provide comprehensive answers encompassing diverse perspectives varies considerably.
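
As a rough illustration of the evaluation setup described above, the sketch below scores model responses to DebateQA-style questions with the two metrics. It is a minimal sketch under stated assumptions: the data fields, the covers and acknowledges_dispute judge functions, and the exact metric definitions are illustrative placeholders, not the authors' released implementation.

```python
from dataclasses import dataclass
from typing import Callable, List


@dataclass
class DebatableQuestion:
    """One DebateQA-style example (fields assumed for illustration)."""
    question: str
    partial_answers: List[str]  # human-annotated perspectives


def perspective_diversity(response: str, partial_answers: List[str],
                          covers: Callable[[str, str], bool]) -> float:
    """Assumed proxy: fraction of annotated perspectives the response covers."""
    if not partial_answers:
        return 0.0
    return sum(covers(response, pa) for pa in partial_answers) / len(partial_answers)


def dispute_awareness(response: str,
                      acknowledges_dispute: Callable[[str], bool]) -> float:
    """Assumed proxy: 1.0 if the response signals the question is debatable, else 0.0."""
    return 1.0 if acknowledges_dispute(response) else 0.0


def evaluate(model_answer: Callable[[str], str],
             data: List[DebatableQuestion],
             covers: Callable[[str, str], bool],
             acknowledges_dispute: Callable[[str], bool]) -> dict:
    """Average both metrics over a list of debatable questions."""
    pd_scores, da_scores = [], []
    for ex in data:
        resp = model_answer(ex.question)
        pd_scores.append(perspective_diversity(resp, ex.partial_answers, covers))
        da_scores.append(dispute_awareness(resp, acknowledges_dispute))
    n = len(data) or 1
    return {"perspective_diversity": sum(pd_scores) / n,
            "dispute_awareness": sum(da_scores) / n}
```

In the paper's actual pipeline the coverage and awareness judgments are model-based rather than simple callables; the point here is only the overall scoring loop.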
Anthology ID:
2026.findings-eacl.44
Volume:
Findings of the Association for Computational Linguistics: EACL 2026
Month:
March
Year:
2026
Address:
Rabat, Morocco
Editors:
Vera Demberg, Kentaro Inui, Lluís Màrquez
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
854–885
URL:
https://preview.aclanthology.org/ingest-eacl/2026.findings-eacl.44/
Cite (ACL):
Rongwu Xu, Xuan Qi, Zehan Qi, Wei Xu, and Zhijiang Guo. 2026. DebateQA: Evaluating Question Answering on Debatable Knowledge. In Findings of the Association for Computational Linguistics: EACL 2026, pages 854–885, Rabat, Morocco. Association for Computational Linguistics.
Cite (Informal):
DebateQA: Evaluating Question Answering on Debatable Knowledge (Xu et al., Findings 2026)
PDF:
https://preview.aclanthology.org/ingest-eacl/2026.findings-eacl.44.pdf
Checklist:
2026.findings-eacl.44.checklist.pdf