ExplainMeetSum: A Dataset for Explainable Meeting Summarization Aligned with Human Intent

Hyun Kim, Minsoo Cho, Seung-Hoon Na


Abstract
To enhance the explainability of meeting summarization, we construct a new dataset, “ExplainMeetSum,” an augmented version of QMSum in which evidence sentences that faithfully “explain” a summary are newly annotated. Using ExplainMeetSum, we propose a novel multiple-extractor-guided summarization method, Multi-DYLE, which extensively generalizes DYLE to enable the use of a supervised extractor based on human-aligned extractive oracles. We further present an explainability-aware task, “Explainable Evidence Extraction” (E3), which aims to automatically detect all evidence sentences that support a given summary. Experimental results on the QMSum dataset show that the proposed Multi-DYLE outperforms DYLE with gains of up to 3.13 in ROUGE-1 score. We also report initial results on the E3 task under settings using separate and joint evaluation metrics.
Anthology ID:
2023.acl-long.731
Volume:
Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
13079–13098
URL:
https://aclanthology.org/2023.acl-long.731
DOI:
10.18653/v1/2023.acl-long.731
Cite (ACL):
Hyun Kim, Minsoo Cho, and Seung-Hoon Na. 2023. ExplainMeetSum: A Dataset for Explainable Meeting Summarization Aligned with Human Intent. In Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 13079–13098, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
ExplainMeetSum: A Dataset for Explainable Meeting Summarization Aligned with Human Intent (Kim et al., ACL 2023)
PDF:
https://preview.aclanthology.org/nschneid-patch-5/2023.acl-long.731.pdf
Video:
https://preview.aclanthology.org/nschneid-patch-5/2023.acl-long.731.mp4