Joint Learning-based Heterogeneous Graph Attention Network for Timeline Summarization

Jingyi You, Dongyuan Li, Hidetaka Kamigaito, Kotaro Funakoshi, Manabu Okumura

Abstract
Previous studies on the timeline summarization (TLS) task ignored the interaction of information between sentences and dates, and adopted pre-defined, non-learnable representations for them. They also treated date selection and event detection as two independent tasks, making it impossible to integrate their advantages and obtain a globally optimal summary. In this paper, we present a joint learning-based heterogeneous graph attention network for TLS (HeterTls), in which date selection and event detection are combined into a unified framework to improve extraction accuracy and remove redundant sentences simultaneously. Our heterogeneous graph involves multiple types of nodes, whose representations are iteratively learned across the heterogeneous graph attention layer. We evaluated our model on four datasets and found that it significantly outperformed the current state-of-the-art baselines in terms of ROUGE scores and date selection metrics.
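
To make the abstract's core idea concrete, the sketch below gives a minimal, hypothetical heterogeneous graph attention layer in PyTorch, in which nodes of one type (e.g. dates) attend over linked nodes of another type (e.g. sentences). The class name, feature dimensions, adjacency construction, and update rule are illustrative assumptions for exposition only, not the authors' HeterTls implementation.

```python
# A minimal, hypothetical sketch of a heterogeneous graph attention layer,
# loosely inspired by the architecture described in the abstract (HeterTls).
# Node types, feature sizes, and the update rule are assumptions, not the paper's code.
import torch
import torch.nn as nn
import torch.nn.functional as F


class HeteroGraphAttentionLayer(nn.Module):
    """Update target-node representations by attending over neighbouring
    source nodes of a different type (e.g. a date node attending over the
    sentences published on that date)."""

    def __init__(self, src_dim: int, tgt_dim: int, hidden_dim: int):
        super().__init__()
        self.w_src = nn.Linear(src_dim, hidden_dim, bias=False)   # type-specific projection of source nodes
        self.w_tgt = nn.Linear(tgt_dim, hidden_dim, bias=False)   # type-specific projection of target nodes
        self.attn = nn.Linear(2 * hidden_dim, 1, bias=False)      # additive attention scorer
        self.out = nn.Linear(tgt_dim + hidden_dim, tgt_dim)       # fuse aggregated message with the original node

    def forward(self, src_feats, tgt_feats, adj):
        # src_feats: (S, src_dim), tgt_feats: (T, tgt_dim)
        # adj: (T, S) binary mask, adj[t, s] = 1 if source node s is linked to target node t
        h_src = self.w_src(src_feats)                               # (S, hidden)
        h_tgt = self.w_tgt(tgt_feats)                               # (T, hidden)
        # Pairwise attention logits between every target and source node.
        pair = torch.cat(
            [h_tgt.unsqueeze(1).expand(-1, h_src.size(0), -1),
             h_src.unsqueeze(0).expand(h_tgt.size(0), -1, -1)], dim=-1)
        logits = F.leaky_relu(self.attn(pair)).squeeze(-1)          # (T, S)
        logits = logits.masked_fill(adj == 0, float("-inf"))        # restrict attention to graph neighbours
        weights = torch.softmax(logits, dim=-1)
        weights = torch.nan_to_num(weights)                         # targets with no neighbours get a zero message
        message = weights @ h_src                                   # (T, hidden) aggregated neighbour information
        return torch.tanh(self.out(torch.cat([tgt_feats, message], dim=-1)))


if __name__ == "__main__":
    # Toy example: 5 sentence nodes, 2 date nodes, and a simple date-sentence adjacency.
    sentences = torch.randn(5, 64)
    dates = torch.randn(2, 32)
    adj = torch.tensor([[1, 1, 0, 0, 0],
                        [0, 0, 1, 1, 1]], dtype=torch.float)
    layer = HeteroGraphAttentionLayer(src_dim=64, tgt_dim=32, hidden_dim=48)
    print(layer(sentences, dates, adj).shape)  # torch.Size([2, 32])
```

Stacking such layers and alternating the roles of sentence and date nodes would let the two node types iteratively refine each other's representations, which is the intuition behind the joint date selection and event detection described above.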
Anthology ID:
2022.naacl-main.301
Volume:
Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies
Month:
July
Year:
2022
Address:
Seattle, United States
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
4091–4104
URL:
https://aclanthology.org/2022.naacl-main.301
DOI:
10.18653/v1/2022.naacl-main.301
Cite (ACL):
Jingyi You, Dongyuan Li, Hidetaka Kamigaito, Kotaro Funakoshi, and Manabu Okumura. 2022. Joint Learning-based Heterogeneous Graph Attention Network for Timeline Summarization. In Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, pages 4091–4104, Seattle, United States. Association for Computational Linguistics.
Cite (Informal):
Joint Learning-based Heterogeneous Graph Attention Network for Timeline Summarization (You et al., NAACL 2022)
PDF:
https://preview.aclanthology.org/auto-file-uploads/2022.naacl-main.301.pdf
Video:
https://preview.aclanthology.org/auto-file-uploads/2022.naacl-main.301.mp4