Selective Encoding for Abstractive Sentence Summarization

Qingyu Zhou, Nan Yang, Furu Wei, Ming Zhou


Abstract
We propose a selective encoding model to extend the sequence-to-sequence framework for abstractive sentence summarization. It consists of a sentence encoder, a selective gate network, and an attention-equipped decoder. The sentence encoder and decoder are built with recurrent neural networks. The selective gate network constructs a second-level sentence representation by controlling the information flow from encoder to decoder. This second-level representation is tailored to the sentence summarization task, which leads to better performance. We evaluate our model on the English Gigaword, DUC 2004, and MSR abstractive sentence summarization datasets. The experimental results show that the proposed selective encoding model outperforms the state-of-the-art baseline models.
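To make the selective gate concrete, here is a minimal PyTorch sketch of the idea described in the abstract: a sigmoid gate, computed from each encoder hidden state together with a whole-sentence vector, that filters the encoder states element-wise before they reach the attentive decoder. The class name, module structure, and dimensions are illustrative assumptions, not the authors' released code (see magic282/SEASS for that).

import torch
import torch.nn as nn

class SelectiveGate(nn.Module):
    """Sketch of a selective gate over encoder hidden states.

    Computes gate_i = sigmoid(W h_i + U s + b) and returns the gated
    states h'_i = h_i * gate_i, i.e. a "second-level" representation
    conditioned on a sentence-level vector s.
    """

    def __init__(self, hidden_size: int):
        super().__init__()
        self.W = nn.Linear(hidden_size, hidden_size, bias=False)  # acts on each encoder state h_i
        self.U = nn.Linear(hidden_size, hidden_size, bias=True)   # acts on the sentence vector s

    def forward(self, enc_states: torch.Tensor, sent_vec: torch.Tensor) -> torch.Tensor:
        # enc_states: (batch, src_len, hidden); sent_vec: (batch, hidden)
        gate = torch.sigmoid(self.W(enc_states) + self.U(sent_vec).unsqueeze(1))
        # Element-wise gating controls the information flow from encoder to decoder
        return enc_states * gate

# Toy usage (hypothetical sizes): gate 2 sentences of length 5 with hidden size 8
gate = SelectiveGate(hidden_size=8)
h = torch.randn(2, 5, 8)   # encoder outputs h_1..h_n, e.g. from a BiGRU
s = torch.randn(2, 8)      # sentence vector, e.g. built from the final encoder states
h_prime = gate(h, s)       # gated states fed to the attention-equipped decoder
print(h_prime.shape)       # torch.Size([2, 5, 8])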
Anthology ID:
P17-1101
Volume:
Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
July
Year:
2017
Address:
Vancouver, Canada
Editors:
Regina Barzilay, Min-Yen Kan
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
1095–1104
URL:
https://aclanthology.org/P17-1101
DOI:
10.18653/v1/P17-1101
Cite (ACL):
Qingyu Zhou, Nan Yang, Furu Wei, and Ming Zhou. 2017. Selective Encoding for Abstractive Sentence Summarization. In Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 1095–1104, Vancouver, Canada. Association for Computational Linguistics.
Cite (Informal):
Selective Encoding for Abstractive Sentence Summarization (Zhou et al., ACL 2017)
PDF:
https://preview.aclanthology.org/ingest-acl-2023-videos/P17-1101.pdf
Video:
https://vimeo.com/234956352
Code:
magic282/SEASS (+ additional community code)
Data:
DUC 2004