Coreference-Aware Dialogue Summarization

Zhengyuan Liu, Ke Shi, Nancy Chen


Abstract
Summarizing conversations via neural approaches has been gaining research traction lately, yet it is still challenging to obtain practical solutions. Examples of such challenges include unstructured information exchange in dialogues, informal interactions between speakers, and dynamic role changes of speakers as the dialogue evolves. Many of these challenges result in complex coreference links. Therefore, in this work, we investigate different approaches to explicitly incorporating coreference information into neural abstractive dialogue summarization models to tackle the aforementioned challenges. Experimental results show that the proposed approaches achieve state-of-the-art performance, implying that it is useful to utilize coreference information in dialogue summarization. Evaluation results on factual correctness suggest that such coreference-aware models are better at tracing the information flow among interlocutors and at associating accurate statuses/actions with the corresponding interlocutors and person mentions.
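The abstract does not specify how coreference information is injected, so the following is only a minimal, hypothetical sketch of one common strategy for coreference-aware summarization inputs: wrapping coreferent mentions in the dialogue with shared cluster-ID markers before feeding the text to a seq2seq summarizer. The function name, marker format, and cluster spans below are illustrative assumptions, not the paper's actual method.

```python
def tag_coreference(utterances, clusters):
    """Wrap each mention span with a marker carrying its coreference cluster ID.

    utterances: list of token lists, one per dialogue turn.
    clusters: list of mention lists; each mention is a (turn, start, end)
              tuple of token indices (end exclusive).
    """
    # Work on a copy so insertions never shift the original span indices.
    tagged = [list(turn) for turn in utterances]
    for cid, mentions in enumerate(clusters):
        for turn, start, end in mentions:
            # Prefix the first token and suffix the last token of the mention.
            tagged[turn][start] = f"<C{cid}> " + tagged[turn][start]
            tagged[turn][end - 1] = tagged[turn][end - 1] + f" </C{cid}>"
    return [" ".join(turn) for turn in tagged]


# Toy dialogue in the style of the SAMSum corpus (tokens pre-split).
dialogue = [
    "Amanda : I baked cookies .".split(),
    "Jerry : Sure ! I love them .".split(),
]
# Hypothetical clusters: {Amanda, I} and {cookies, them}.
clusters = [
    [(0, 0, 1), (0, 2, 3)],
    [(0, 4, 5), (1, 6, 7)],
]
print(tag_coreference(dialogue, clusters))
```

The tagged turns, e.g. `<C0> Amanda </C0> : <C0> I </C0> baked <C1> cookies </C1> .`, make cross-turn coreference links explicit in the model's input sequence, one simple way to help a summarizer attribute actions to the right interlocutor.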
Anthology ID:
2021.sigdial-1.53
Original:
2021.sigdial-1.53v1
Version 2:
2021.sigdial-1.53v2
Volume:
Proceedings of the 22nd Annual Meeting of the Special Interest Group on Discourse and Dialogue
Month:
July
Year:
2021
Address:
Singapore and Online
Venue:
SIGDIAL
SIG:
SIGDIAL
Publisher:
Association for Computational Linguistics
Pages:
509–519
URL:
https://aclanthology.org/2021.sigdial-1.53
Cite (ACL):
Zhengyuan Liu, Ke Shi, and Nancy Chen. 2021. Coreference-Aware Dialogue Summarization. In Proceedings of the 22nd Annual Meeting of the Special Interest Group on Discourse and Dialogue, pages 509–519, Singapore and Online. Association for Computational Linguistics.
Cite (Informal):
Coreference-Aware Dialogue Summarization (Liu et al., SIGDIAL 2021)
PDF:
https://preview.aclanthology.org/ingestion-script-update/2021.sigdial-1.53.pdf
Video:
https://www.youtube.com/watch?v=XNiUdhaW6LI
Code
seq-to-mind/coref_dial_summ
Data
SAMSum Corpus