Improving Abstractive Dialogue Summarization with Hierarchical Pretraining and Topic Segment

MengNan Qi, Hao Liu, YuZhuo Fu, Ting Liu


Abstract
With the increasing abundance of meeting transcripts, meeting summarization has attracted growing attention from researchers. Unsupervised pre-training methods based on the Transformer architecture, combined with fine-tuning on downstream tasks, have achieved great success in text summarization. However, the semantic structure and style of meeting transcripts differ considerably from those of articles. In this work, we propose a hierarchical Transformer encoder-decoder network with multi-task pre-training. Specifically, we mask key sentences at the word-level encoder and generate them at the decoder. We also randomly mask some of the role alignments in the input text and force the model to recover the original role tags to complete the alignments. In addition, we introduce a topic segmentation mechanism to further improve the quality of the generated summaries. Experimental results show that our model outperforms previous methods on the AMI and ICSI meeting summarization datasets.
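As a rough illustration of the role-alignment pre-training task described in the abstract, the sketch below corrupts dialogue input by randomly masking speaker-role tags so a model can be trained to recover them. The mask token name, masking rate, function names, and (role, utterance) data layout are all assumptions for illustration; they are not taken from the paper's implementation.

```python
import random

ROLE_MASK = "[ROLE_MASK]"  # hypothetical mask token, not from the paper

def mask_role_tags(turns, mask_prob=0.15, seed=None):
    """Randomly mask speaker-role tags in dialogue turns.

    Each turn is a (role, utterance) pair, e.g. ("PM", "Let's review the design.").
    Returns the corrupted turns and the list of original roles the model
    must recover (the role-alignment pre-training target).
    """
    rng = random.Random(seed)
    corrupted, targets = [], []
    for role, utterance in turns:
        if rng.random() < mask_prob:
            corrupted.append((ROLE_MASK, utterance))
            targets.append(role)  # the model must predict this role
        else:
            corrupted.append((role, utterance))
    return corrupted, targets

# Example with AMI-style roles (PM = project manager, ID = industrial designer)
turns = [("PM", "Let's start with the budget."),
         ("ID", "The casing should be rubber."),
         ("PM", "Agreed, rubber is cheaper.")]
masked_turns, role_targets = mask_role_tags(turns, mask_prob=0.5, seed=0)
```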
Anthology ID:
2021.findings-emnlp.97
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2021
Month:
November
Year:
2021
Address:
Punta Cana, Dominican Republic
Venue:
Findings
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Pages:
1121–1130
URL:
https://aclanthology.org/2021.findings-emnlp.97
DOI:
10.18653/v1/2021.findings-emnlp.97
Cite (ACL):
MengNan Qi, Hao Liu, YuZhuo Fu, and Ting Liu. 2021. Improving Abstractive Dialogue Summarization with Hierarchical Pretraining and Topic Segment. In Findings of the Association for Computational Linguistics: EMNLP 2021, pages 1121–1130, Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
Improving Abstractive Dialogue Summarization with Hierarchical Pretraining and Topic Segment (Qi et al., Findings 2021)
PDF:
https://preview.aclanthology.org/ingestion-script-update/2021.findings-emnlp.97.pdf
Video:
https://preview.aclanthology.org/ingestion-script-update/2021.findings-emnlp.97.mp4