Contextual Metric Meta-Evaluation by Measuring Local Metric Accuracy

Athiya Deviyani, Fernando Diaz


Abstract
Meta-evaluation of automatic evaluation metrics (assessing the evaluation metrics themselves) is crucial for accurately benchmarking natural language processing systems and has implications for scientific inquiry, production model development, and policy enforcement. Existing approaches to metric meta-evaluation focus on general statements about the absolute and relative quality of metrics across arbitrary system outputs; in practice, however, metrics are applied in highly contextual settings, often measuring performance on a highly constrained set of system outputs. For example, we may only be interested in evaluating a specific model or class of models. We introduce a method for contextual metric meta-evaluation that compares the local metric accuracy of evaluation metrics. Across translation, speech recognition, and ranking tasks, we demonstrate that local metric accuracy varies both in absolute value and in relative effectiveness as we shift across evaluation contexts. This observed variation highlights the importance of adopting context-specific metric evaluations over global ones.
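To make the abstract's central quantity concrete, the following is a minimal sketch of one plausible reading of "local metric accuracy": the rate at which a metric's pairwise ordering of outputs agrees with human judgments, restricted to a context-defining subset of outputs. The function name, data layout, and tie handling are illustrative assumptions, not the paper's exact formulation.

```python
# Hypothetical sketch: local metric accuracy as pairwise agreement between
# a metric's ordering and human judgments, restricted to a "context"
# (a subset of output IDs, e.g. outputs from one model family).
from itertools import combinations


def local_metric_accuracy(metric_scores, human_scores, context_ids):
    """Fraction of output pairs within `context_ids` on which the metric
    and the human judgments induce the same ordering (tied pairs skipped)."""
    agree = total = 0
    for i, j in combinations(context_ids, 2):
        dm = metric_scores[i] - metric_scores[j]  # metric's preference
        dh = human_scores[i] - human_scores[j]    # human preference
        if dm == 0 or dh == 0:
            continue  # ties give no ordering signal
        total += 1
        if (dm > 0) == (dh > 0):
            agree += 1
    return agree / total if total else float("nan")


# Toy example: within this context the metric agrees with humans on
# 2 of the 3 orderable pairs.
metric = {"a": 0.9, "b": 0.7, "c": 0.4}
human = {"a": 5.0, "b": 3.0, "c": 4.0}
print(local_metric_accuracy(metric, human, ["a", "b", "c"]))  # → 0.6666...
```

Comparing this quantity for two metrics on the same context subset, rather than over all outputs globally, is the kind of contextual comparison the abstract argues for.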
Anthology ID:
2025.findings-naacl.276
Volume:
Findings of the Association for Computational Linguistics: NAACL 2025
Month:
April
Year:
2025
Address:
Albuquerque, New Mexico
Editors:
Luis Chiruzzo, Alan Ritter, Lu Wang
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
4906–4925
URL:
https://preview.aclanthology.org/fix-sig-urls/2025.findings-naacl.276/
Cite (ACL):
Athiya Deviyani and Fernando Diaz. 2025. Contextual Metric Meta-Evaluation by Measuring Local Metric Accuracy. In Findings of the Association for Computational Linguistics: NAACL 2025, pages 4906–4925, Albuquerque, New Mexico. Association for Computational Linguistics.
Cite (Informal):
Contextual Metric Meta-Evaluation by Measuring Local Metric Accuracy (Deviyani & Diaz, Findings 2025)
PDF:
https://preview.aclanthology.org/fix-sig-urls/2025.findings-naacl.276.pdf