HETFORMER: Heterogeneous Transformer with Sparse Attention for Long-Text Extractive Summarization

Ye Liu, Jianguo Zhang, Yao Wan, Congying Xia, Lifang He, Philip Yu


Abstract
To capture the semantic graph structure of raw text, most existing summarization approaches are built on graph neural networks (GNNs) stacked on top of a pre-trained model. However, these methods suffer from cumbersome procedures and inefficient computation on long-text documents. To mitigate these issues, this paper proposes HetFormer, a Transformer-based pre-trained model with multi-granularity sparse attention for long-text extractive summarization. Specifically, we model the different types of semantic nodes in raw text as a potential heterogeneous graph and directly learn the heterogeneous relationships (edges) among nodes with the Transformer. Extensive experiments on both single- and multi-document summarization tasks show that HetFormer achieves state-of-the-art performance in ROUGE F1 while using less memory and fewer parameters.
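To make the attention pattern concrete, the sketch below illustrates how a multi-granularity sparse attention mask might combine fine-grained token-to-token edges (a sliding window) with coarse-grained sentence-node edges (global attention), in the spirit of Longformer-style sparse attention. This is not the authors' implementation: the function name build_sparse_mask and its parameters (window, sent_positions) are hypothetical, and a practical system would use custom sparse kernels rather than a dense boolean mask.

import torch

def build_sparse_mask(seq_len, window, sent_positions):
    # Boolean mask: mask[i, j] == True means query i may attend to key j.
    mask = torch.zeros(seq_len, seq_len, dtype=torch.bool)
    # Fine granularity: token-to-token edges via a sliding window.
    for i in range(seq_len):
        lo, hi = max(0, i - window), min(seq_len, i + window + 1)
        mask[i, lo:hi] = True
    # Coarse granularity: sentence nodes (e.g. sentence-boundary tokens)
    # get global edges -- they attend to, and are attended by, all tokens.
    sent = torch.tensor(sent_positions)
    mask[sent, :] = True
    mask[:, sent] = True
    return mask

# Toy example: 16 tokens, local window of 2, sentence nodes at 0 and 8.
mask = build_sparse_mask(16, 2, [0, 8])
scores = torch.randn(16, 16)                       # raw attention scores
scores = scores.masked_fill(~mask, float("-inf"))  # prune non-edges
attn = torch.softmax(scores, dim=-1)               # sparse attention weights

Here each node type (token, sentence) contributes its own edge pattern to the mask, which is the sense in which the attention is heterogeneous and multi-granular; the paper additionally considers other node and edge types beyond this two-level example.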
Anthology ID:
2021.emnlp-main.13
Volume:
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2021
Address:
Online and Punta Cana, Dominican Republic
Editors:
Marie-Francine Moens, Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
146–154
URL:
https://aclanthology.org/2021.emnlp-main.13
DOI:
10.18653/v1/2021.emnlp-main.13
Cite (ACL):
Ye Liu, Jianguo Zhang, Yao Wan, Congying Xia, Lifang He, and Philip Yu. 2021. HETFORMER: Heterogeneous Transformer with Sparse Attention for Long-Text Extractive Summarization. In Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, pages 146–154, Online and Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
HETFORMER: Heterogeneous Transformer with Sparse Attention for Long-Text Extractive Summarization (Liu et al., EMNLP 2021)
PDF:
https://preview.aclanthology.org/emnlp-22-attachments/2021.emnlp-main.13.pdf
Software:
2021.emnlp-main.13.Software.zip
Video:
https://preview.aclanthology.org/emnlp-22-attachments/2021.emnlp-main.13.mp4
Code:
yeliu918/hetformer