Neural Abstractive Multi-Document Summarization: Hierarchical or Flat Structure?

Ye Ma, Lu Zong


Abstract
Building on WikiSum (CITATION), which enables neural Multi-Document Summarization (MDS) to be learned from a large-scale dataset, this study develops two hierarchical Transformers (HT) that capture both cross-token and cross-document dependencies while allowing extended input-document lengths. By incorporating word- and paragraph-level multi-head attentions in the decoder, based on parallel and vertical architectures respectively, the proposed parallel and vertical hierarchical Transformers (PHT & VHT) generate summaries using context-aware word embeddings together with static and dynamic paragraph embeddings. A comprehensive evaluation is conducted on WikiSum to compare PHT & VHT with established models and to answer the question of whether hierarchical structures offer more promising performance than flat structures in the MDS task. The results suggest that our hierarchical models generate higher-quality summaries by better capturing cross-document relationships, while consuming less memory than flat-structure models. Moreover, we recommend PHT for its practical value of higher inference speed and greater memory savings.
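To make the hierarchical-attention idea in the abstract concrete, the following is a minimal sketch of a decoder layer with parallel word- and paragraph-level cross-attentions whose contexts are fused by a gate. The module names, dimensions, and the gating-based fusion are illustrative assumptions, not the authors' published implementation of PHT.

```python
# Hypothetical sketch of a parallel hierarchical decoder layer: the decoder
# attends to word-level encoder states and to paragraph-level embeddings in
# two separate multi-head attentions, then fuses the two contexts with a gate.
# All names, sizes, and the fusion choice are assumptions for illustration.
import torch
import torch.nn as nn

class ParallelHierarchicalDecoderLayer(nn.Module):
    def __init__(self, d_model=512, n_heads=8):
        super().__init__()
        self.self_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        # Parallel cross-attentions: one over tokens, one over paragraphs.
        self.word_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.para_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.gate = nn.Linear(2 * d_model, d_model)  # fuses the two contexts
        self.ffn = nn.Sequential(nn.Linear(d_model, 4 * d_model), nn.ReLU(),
                                 nn.Linear(4 * d_model, d_model))
        self.norm1 = nn.LayerNorm(d_model)
        self.norm2 = nn.LayerNorm(d_model)
        self.norm3 = nn.LayerNorm(d_model)

    def forward(self, tgt, word_mem, para_mem):
        # tgt:      (batch, tgt_len, d_model)   decoder states
        # word_mem: (batch, n_tokens, d_model)  context-aware word embeddings
        # para_mem: (batch, n_paras, d_model)   paragraph embeddings
        x = self.norm1(tgt + self.self_attn(tgt, tgt, tgt)[0])
        word_ctx = self.word_attn(x, word_mem, word_mem)[0]
        para_ctx = self.para_attn(x, para_mem, para_mem)[0]
        g = torch.sigmoid(self.gate(torch.cat([word_ctx, para_ctx], dim=-1)))
        x = self.norm2(x + g * word_ctx + (1 - g) * para_ctx)
        return self.norm3(x + self.ffn(x))

# Toy usage: source documents flattened to 40 tokens grouped into 4 paragraphs.
layer = ParallelHierarchicalDecoderLayer()
out = layer(torch.randn(1, 10, 512), torch.randn(1, 40, 512), torch.randn(1, 4, 512))
print(out.shape)  # torch.Size([1, 10, 512])
```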
Anthology ID:
2020.iwdp-1.6
Volume:
Proceedings of the Second International Workshop of Discourse Processing
Month:
December
Year:
2020
Address:
Suzhou, China
Editors:
Qun Liu, Deyi Xiong, Shili Ge, Xiaojun Zhang
Venue:
iwdp
Publisher:
Association for Computational Linguistics
Pages:
29–37
URL:
https://aclanthology.org/2020.iwdp-1.6
Cite (ACL):
Ye Ma and Lu Zong. 2020. Neural Abstractive Multi-Document Summarization: Hierarchical or Flat Structure?. In Proceedings of the Second International Workshop of Discourse Processing, pages 29–37, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
Neural Abstractive Multi-Document Summarization: Hierarchical or Flat Structure? (Ma & Zong, iwdp 2020)
PDF:
https://preview.aclanthology.org/emnlp-22-attachments/2020.iwdp-1.6.pdf
Data
WikiSum