Neural Related Work Summarization with a Joint Context-driven Attention Mechanism

Yongzhen Wang, Xiaozhong Liu, Zheng Gao


Abstract
Conventional solutions to automatic related work summarization rely heavily on human-engineered features. In this paper, we develop a neural data-driven summarizer by leveraging the seq2seq paradigm, in which a joint context-driven attention mechanism is proposed to measure the contextual relevance within full texts and a heterogeneous bibliography graph simultaneously. Our motivation is to maintain topic coherence between a related work section and its target document, where both the textual and graph contexts play a crucial role in accurately characterizing the relationships among scientific publications. Experimental results on a large dataset show that our approach achieves a considerable improvement over a typical seq2seq summarizer and five classical summarization baselines.
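For intuition only, below is a minimal sketch of how a joint context-driven attention step might fuse a textual relevance score with a bibliography-graph relevance score before forming the decoder's context vector. The interpolation weight lam, the dot-product-style scores, and all function names are illustrative assumptions; the abstract does not specify the authors' exact formulation.

# Hypothetical sketch, not the authors' implementation: blend textual and
# graph relevance scores into a single attention distribution.
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def joint_context_attention(enc_states, text_scores, graph_scores, lam=0.5):
    """enc_states: (n, d) encoder hidden states for the source units.
    text_scores: (n,) relevance from the textual context.
    graph_scores: (n,) relevance from the heterogeneous bibliography graph.
    lam: assumed interpolation weight between the two contexts.
    """
    joint = lam * text_scores + (1.0 - lam) * graph_scores  # fuse the two contexts
    alpha = softmax(joint)                                   # attention distribution
    context = alpha @ enc_states                             # weighted context vector
    return context, alpha

# Toy usage: 4 source units with 8-dimensional encoder states.
rng = np.random.default_rng(0)
enc = rng.normal(size=(4, 8))
ctx, alpha = joint_context_attention(enc, rng.normal(size=4), rng.normal(size=4))
print(alpha.round(3), ctx.shape)

In this sketch the graph score simply shifts the attention logits, so source units that are both textually relevant and well connected in the bibliography graph receive higher weight; the paper's actual mechanism may combine the two contexts differently.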
Anthology ID:
D18-1204
Volume:
Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing
Month:
October-November
Year:
2018
Address:
Brussels, Belgium
Venue:
EMNLP
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Pages:
1776–1786
URL:
https://aclanthology.org/D18-1204
DOI:
10.18653/v1/D18-1204
Cite (ACL):
Yongzhen Wang, Xiaozhong Liu, and Zheng Gao. 2018. Neural Related Work Summarization with a Joint Context-driven Attention Mechanism. In Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing, pages 1776–1786, Brussels, Belgium. Association for Computational Linguistics.
Cite (Informal):
Neural Related Work Summarization with a Joint Context-driven Attention Mechanism (Wang et al., EMNLP 2018)
PDF:
https://preview.aclanthology.org/emnlp-22-attachments/D18-1204.pdf
Video:
https://vimeo.com/305686976
Code:
kuadmu/2018EMNLP