Discourse-Aware Hierarchical Attention Network for Extractive Single-Document Summarization

Tatsuya Ishigaki, Hidetaka Kamigaito, Hiroya Takamura, Manabu Okumura
Abstract
Discourse relations between sentences are often represented as a tree, and this tree structure provides important information for summarizers to create a short and coherent summary. However, current neural network-based summarizers treat the source document as a mere sequence of sentences and ignore the tree-like discourse structure inherent in the document. To incorporate discourse tree structure into neural network-based summarizers, we propose a discourse-aware neural extractive summarizer that explicitly takes into account the discourse dependency tree structure of the source document. Our discourse-aware summarizer jointly learns the discourse structure and the salience score of each sentence through novel hierarchical attention modules, which can be trained on automatically parsed discourse dependency trees. Experimental results showed that our model achieved performance competitive with or better than that of state-of-the-art models in terms of ROUGE scores on the DailyMail dataset. We further conducted manual evaluations, whose results showed that our approach also improved the coherence of the output summaries.
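The abstract describes a hierarchical attention module that jointly induces soft discourse dependency arcs between sentences and scores each sentence's salience. The sketch below is a minimal, hypothetical illustration of that idea in PyTorch, not the authors' implementation: sentence encodings, the bilinear parent scorer, and the salience head are all assumed components, and the exact parameterization in the paper may differ.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class DiscourseAwareAttention(nn.Module):
    """Hypothetical sketch: jointly computes a soft parent distribution
    over a discourse dependency tree and a salience score per sentence."""

    def __init__(self, hidden_dim: int):
        super().__init__()
        # Bilinear scorer: entry (i, j) scores sentence j as parent of i.
        self.parent_scorer = nn.Bilinear(hidden_dim, hidden_dim, 1)
        # Salience head combines a sentence's encoding with its
        # parent-informed context vector.
        self.salience = nn.Linear(2 * hidden_dim, 1)

    def forward(self, sent_enc: torch.Tensor):
        # sent_enc: (num_sents, hidden_dim) encodings of document sentences.
        n, d = sent_enc.shape
        child = sent_enc.unsqueeze(1).expand(n, n, d).reshape(-1, d)
        parent = sent_enc.unsqueeze(0).expand(n, n, d).reshape(-1, d)
        scores = self.parent_scorer(child, parent).view(n, n)
        scores.fill_diagonal_(float("-inf"))  # a sentence is not its own parent
        parent_attn = F.softmax(scores, dim=-1)  # soft discourse dependency arcs
        # Expected parent encoding under the attention distribution.
        context = parent_attn @ sent_enc
        salience = torch.sigmoid(
            self.salience(torch.cat([sent_enc, context], dim=-1))
        ).squeeze(-1)
        return salience, parent_attn


# Usage: score 5 sentences encoded as 128-dim vectors (e.g., from an RNN encoder).
module = DiscourseAwareAttention(hidden_dim=128)
salience, arcs = module(torch.randn(5, 128))
```

In this reading, the parent attention can be supervised with automatically parsed discourse dependency trees while the salience scores are supervised with extraction labels, which is one plausible way to realize the joint training the abstract mentions.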
Anthology ID:
R19-1059
Volume:
Proceedings of the International Conference on Recent Advances in Natural Language Processing (RANLP 2019)
Month:
September
Year:
2019
Address:
Varna, Bulgaria
Venue:
RANLP
Publisher:
INCOMA Ltd.
Pages:
497–506
URL:
https://aclanthology.org/R19-1059
DOI:
10.26615/978-954-452-056-4_059
Bibkey:
Cite (ACL):
Tatsuya Ishigaki, Hidetaka Kamigaito, Hiroya Takamura, and Manabu Okumura. 2019. Discourse-Aware Hierarchical Attention Network for Extractive Single-Document Summarization. In Proceedings of the International Conference on Recent Advances in Natural Language Processing (RANLP 2019), pages 497–506, Varna, Bulgaria. INCOMA Ltd.
Cite (Informal):
Discourse-Aware Hierarchical Attention Network for Extractive Single-Document Summarization (Ishigaki et al., RANLP 2019)
PDF:
https://preview.aclanthology.org/ingestion-script-update/R19-1059.pdf