TabXEval: Why this is a Bad Table? An eXhaustive Rubric for Table Evaluation

Vihang Pancholi, Jainit Sushil Bafna, Tejas Anvekar, Manish Shrivastava, Vivek Gupta


Abstract
Evaluating tables qualitatively and quantitatively poses a significant challenge, as standard metrics often overlook subtle structural and content-level discrepancies. To address this, we propose a rubric-based evaluation framework that integrates multi-level structural descriptors with fine-grained contextual signals, enabling more precise and consistent table comparison. Building on this, we introduce TabXEval, an eXhaustive and eXplainable two-phase evaluation framework. TabXEval first aligns reference and predicted tables structurally via TabAlign, then performs semantic and syntactic comparison using TabCompare, offering interpretable and granular feedback. We evaluate TabXEval on TabXBench, a diverse, multi-domain benchmark featuring realistic table perturbations and human annotations. A sensitivity-specificity analysis further demonstrates the robustness and explainability of TabXEval across varied table tasks. Code and data are available at https://corallab-asu.github.io/tabxeval/.
Anthology ID:
2025.findings-acl.1176
Volume:
Findings of the Association for Computational Linguistics: ACL 2025
Month:
July
Year:
2025
Address:
Vienna, Austria
Editors:
Wanxiang Che, Joyce Nabende, Ekaterina Shutova, Mohammad Taher Pilehvar
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
22913–22934
URL:
https://preview.aclanthology.org/corrections-2025-08/2025.findings-acl.1176/
DOI:
10.18653/v1/2025.findings-acl.1176
Cite (ACL):
Vihang Pancholi, Jainit Sushil Bafna, Tejas Anvekar, Manish Shrivastava, and Vivek Gupta. 2025. TabXEval: Why this is a Bad Table? An eXhaustive Rubric for Table Evaluation. In Findings of the Association for Computational Linguistics: ACL 2025, pages 22913–22934, Vienna, Austria. Association for Computational Linguistics.
Cite (Informal):
TabXEval: Why this is a Bad Table? An eXhaustive Rubric for Table Evaluation (Pancholi et al., Findings 2025)
PDF:
https://preview.aclanthology.org/corrections-2025-08/2025.findings-acl.1176.pdf