Understanding the Extent to which Content Quality Metrics Measure the Information Quality of Summaries

Daniel Deutsch, Dan Roth


Abstract
Reference-based metrics such as ROUGE or BERTScore evaluate the content quality of a summary by comparing the summary to a reference. Ideally, this comparison should measure the summary’s information quality by calculating how much information the summaries have in common. In this work, we analyze the token alignments used by ROUGE and BERTScore to compare summaries and argue that their scores largely cannot be interpreted as measuring information overlap. Rather, they are better estimates of the extent to which the summaries discuss the same topics. Further, we provide evidence that this result holds true for many other summarization evaluation metrics. The consequence of this result is that the most frequently used summarization evaluation metrics do not align with the community’s research goal of generating summaries with high-quality information. However, we conclude by demonstrating that a recently proposed metric, QAEval, which scores summaries using question-answering, appears to better capture information quality than current evaluations, highlighting a direction for future research.
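
To make the contrast concrete, the following is a minimal sketch of reference-based scoring using the open-source rouge_score and bert_score packages (illustrative choices; the paper's own experiments may use different implementations or settings). Both metrics compare a candidate summary to a reference through token alignments, lexical for ROUGE and embedding-based for BERTScore.

    from rouge_score import rouge_scorer
    from bert_score import score as bert_score

    reference = "The company reported record profits driven by cloud revenue."
    candidate = "Cloud revenue drove the company's record profits this quarter."

    # ROUGE: n-gram overlap between the candidate and the reference.
    scorer = rouge_scorer.RougeScorer(["rouge1", "rouge2", "rougeL"], use_stemmer=True)
    rouge = scorer.score(reference, candidate)
    print({name: round(s.fmeasure, 3) for name, s in rouge.items()})

    # BERTScore: greedy alignment of contextual token embeddings.
    P, R, F1 = bert_score([candidate], [reference], lang="en")
    print(f"BERTScore F1: {F1.item():.3f}")

The QA-based alternative highlighted in the conclusion can be illustrated with a toy protocol: question-answer pairs are derived from the reference, the questions are answered against the candidate, and the score is the fraction of answers recovered. The hand-written QA pairs and the lenient string-match scoring below are simplifying assumptions for illustration, not QAEval's actual pipeline, which generates questions automatically and verifies answers with learned models.

    from transformers import pipeline

    # Hypothetical QA pairs derived from the reference summary; QAEval
    # generates these automatically, which is not reproduced here.
    qa_pairs = [
        ("What drove the record profits?", "cloud revenue"),
        ("What did the company report?", "record profits"),
    ]

    candidate = "Cloud revenue drove the company's record profits this quarter."
    qa = pipeline("question-answering", model="distilbert-base-cased-distilled-squad")

    # Answer each question against the candidate and score by lenient
    # string match (a simplification of QAEval's answer verification).
    matches = 0
    for question, gold in qa_pairs:
        answer = qa(question=question, context=candidate)["answer"].lower()
        matches += int(gold.lower() in answer or answer in gold.lower())
    print(f"Toy QA-based information-overlap score: {matches / len(qa_pairs):.2f}")
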
Anthology ID:
2021.conll-1.24
Volume:
Proceedings of the 25th Conference on Computational Natural Language Learning
Month:
November
Year:
2021
Address:
Online
Editors:
Arianna Bisazza, Omri Abend
Venue:
CoNLL
SIG:
SIGNLL
Publisher:
Association for Computational Linguistics
Pages:
300–309
URL:
https://aclanthology.org/2021.conll-1.24
DOI:
10.18653/v1/2021.conll-1.24
Cite (ACL):
Daniel Deutsch and Dan Roth. 2021. Understanding the Extent to which Content Quality Metrics Measure the Information Quality of Summaries. In Proceedings of the 25th Conference on Computational Natural Language Learning, pages 300–309, Online. Association for Computational Linguistics.
Cite (Informal):
Understanding the Extent to which Content Quality Metrics Measure the Information Quality of Summaries (Deutsch & Roth, CoNLL 2021)
PDF:
https://preview.aclanthology.org/dois-2013-emnlp/2021.conll-1.24.pdf
Video:
https://preview.aclanthology.org/dois-2013-emnlp/2021.conll-1.24.mp4