Abstract
Automatic summarization aims to extract the most important information from large amounts of textual data in order to create a shorter version of the original texts while preserving their information. Training traditional extractive summarization models relies heavily on human-engineered labels such as sentence-level annotations of summary-worthiness. However, in many use cases such human-engineered labels do not exist, and manually annotating thousands of documents to train models may not be feasible. On the other hand, indirect signals for summarization are often available, such as agent actions for customer service dialogues, headlines for news articles, diagnoses for Electronic Health Records, etc. In this paper, we develop a general framework that generates extractive summaries as a byproduct of supervised learning on these indirect signals, with the help of an attention mechanism. We test our models on customer service dialogues, and experimental results demonstrate that our models can reliably select informative sentences and words for automatic summarization.
- Anthology ID:
- 2021.sigdial-1.54
- Volume:
- Proceedings of the 22nd Annual Meeting of the Special Interest Group on Discourse and Dialogue
- Month:
- July
- Year:
- 2021
- Address:
- Singapore and Online
- Editors:
- Haizhou Li, Gina-Anne Levow, Zhou Yu, Chitralekha Gupta, Berrak Sisman, Siqi Cai, David Vandyke, Nina Dethlefs, Yan Wu, Junyi Jessy Li
- Venue:
- SIGDIAL
- SIG:
- SIGDIAL
- Publisher:
- Association for Computational Linguistics
- Pages:
- 520–529
- URL:
- https://aclanthology.org/2021.sigdial-1.54
- DOI:
- 10.18653/v1/2021.sigdial-1.54
- Cite (ACL):
- Yingying Zhuang, Yichao Lu, and Simi Wang. 2021. Weakly Supervised Extractive Summarization with Attention. In Proceedings of the 22nd Annual Meeting of the Special Interest Group on Discourse and Dialogue, pages 520–529, Singapore and Online. Association for Computational Linguistics.
- Cite (Informal):
- Weakly Supervised Extractive Summarization with Attention (Zhuang et al., SIGDIAL 2021)
- PDF:
- https://aclanthology.org/2021.sigdial-1.54.pdf
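The core idea in the abstract, training a classifier on an indirect signal and reading the attention weights back as sentence-importance scores, can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: the encoder outputs, the attention query, and the dimensions are all hypothetical stand-ins.

```python
import numpy as np

# Hypothetical sketch of attention-as-summarizer: a classifier trained on an
# indirect label (e.g. agent actions) pools sentence vectors with attention
# weights; those weights double as extractive-summary scores.

rng = np.random.default_rng(0)

def softmax(x):
    """Numerically stable softmax over a 1-D array."""
    e = np.exp(x - x.max())
    return e / e.sum()

# Stand-in encoder output: one vector per sentence of a dialogue.
sentence_vecs = rng.normal(size=(5, 16))   # 5 sentences, hidden dim 16
query = rng.normal(size=16)                # learned attention query (illustrative)

# Attention over sentences: these weights would pool the sentence vectors
# into a document representation before predicting the indirect label.
scores = sentence_vecs @ query
weights = softmax(scores)

# Extractive summary = the top-k sentences by attention weight.
top_k = np.argsort(weights)[::-1][:2]
print("selected sentence indices:", sorted(top_k.tolist()))
```

In the weakly supervised setting described above, no summary labels are ever used: the only supervision is the indirect signal, and the attention distribution learned for that task is repurposed to rank sentences.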