A Unified Model for Extractive and Abstractive Summarization using Inconsistency Loss

Wan-Ting Hsu, Chieh-Kai Lin, Ming-Ying Lee, Kerui Min, Jing Tang, Min Sun


Abstract
We propose a unified model that combines the strengths of extractive and abstractive summarization. On the one hand, a simple extractive model can compute sentence-level attention and achieve high ROUGE scores, but its summaries are less readable. On the other hand, a more complex abstractive model can compute word-level dynamic attention and generate more readable paragraphs. In our model, sentence-level attention is used to modulate the word-level attention so that words in less attended sentences are less likely to be generated. Moreover, a novel inconsistency loss function is introduced to penalize disagreement between the two levels of attention. By training our model end-to-end with the inconsistency loss together with the original losses of the extractive and abstractive models, we achieve state-of-the-art ROUGE scores on the CNN/Daily Mail dataset, and human evaluators judge our summaries to be the most informative and readable.
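To make the two ideas in the abstract concrete, here is a minimal sketch of attention modulation and an inconsistency penalty at a single decoder step. The variable names, array shapes, and the top-k form of the loss are our assumptions for illustration, not taken from the authors' code or the paper's exact equations.

```python
import numpy as np

def modulated_word_attention(alpha, beta, sent_id):
    """Scale each word's attention by its sentence's attention, then renormalize.

    alpha   : (num_words,)  word-level attention weights (sum to 1)
    beta    : (num_sents,)  sentence-level attention (extractive scores)
    sent_id : (num_words,)  index of the sentence containing each word
    """
    # Words in weakly attended sentences are down-weighted, so they
    # become less likely to be generated by the abstractive decoder.
    scaled = alpha * beta[sent_id]
    return scaled / scaled.sum()

def inconsistency_loss(alpha, beta, sent_id, k=3):
    """Assumed penalty: large when the k most-attended words fall in
    sentences with low sentence-level attention."""
    topk = np.argsort(alpha)[-k:]  # indices of the k most-attended words
    return -np.log(np.mean(alpha[topk] * beta[sent_id[topk]]) + 1e-12)

# Toy example: 6 words across 2 sentences; the extractor favors sentence 0.
alpha   = np.array([0.30, 0.25, 0.15, 0.10, 0.10, 0.10])
beta    = np.array([0.9, 0.1])
sent_id = np.array([0, 0, 0, 1, 1, 1])
print(modulated_word_attention(alpha, beta, sent_id))  # mass shifts to sentence 0
print(inconsistency_loss(alpha, beta, sent_id))        # small, since the levels agree
```

In this sketch the loss is small when word- and sentence-level attention agree (top words lie in highly scored sentences) and grows as they diverge, which is the behavior the inconsistency loss is described as encouraging.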
Anthology ID:
P18-1013
Volume:
Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
July
Year:
2018
Address:
Melbourne, Australia
Editors:
Iryna Gurevych, Yusuke Miyao
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
132–141
URL:
https://preview.aclanthology.org/build-pipeline-with-new-library/P18-1013/
DOI:
10.18653/v1/P18-1013
Cite (ACL):
Wan-Ting Hsu, Chieh-Kai Lin, Ming-Ying Lee, Kerui Min, Jing Tang, and Min Sun. 2018. A Unified Model for Extractive and Abstractive Summarization using Inconsistency Loss. In Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 132–141, Melbourne, Australia. Association for Computational Linguistics.
Cite (Informal):
A Unified Model for Extractive and Abstractive Summarization using Inconsistency Loss (Hsu et al., ACL 2018)
PDF:
https://preview.aclanthology.org/build-pipeline-with-new-library/P18-1013.pdf
Note:
P18-1013.Notes.pdf
Presentation:
P18-1013.Presentation.pdf
Video:
https://preview.aclanthology.org/build-pipeline-with-new-library/P18-1013.mp4
Data
CNN/Daily Mail