Abstract
Traditional extractive summarization treats the task as sentence-level classification and requires a fixed number of sentences to be extracted. However, this rigid constraint on the extraction count may hinder model generalization, since summary lengths vary across datasets. In this work, we leverage the interrelation between information extraction (IE) and text summarization and introduce a fine-grained autoregressive method for extractive summarization based on semantic tuple extraction. Specifically, we represent each sentence as a set of semantic tuples, where each tuple is a predicate-argument structure obtained via IE. We then adopt a Transformer-based autoregressive model that, given a source document, extracts the tuples corresponding to the target summary. At inference time, a greedy approach selects source sentences until the extracted tuples are covered, eliminating the need for a fixed extraction count. Experiments on CNN/DM and NYT demonstrate the method’s superiority over strong baselines. In a zero-shot setting that tests generalization to diverse summary lengths across datasets, our method further outperforms baselines, including ChatGPT.
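The inference step described in the abstract amounts to a covering problem: pick source sentences until their tuples cover the tuples predicted for the summary. The following is a minimal sketch, not the authors' implementation, with hypothetical tuples and a hypothetical `greedy_select` helper, showing how such a greedy selection could work.

```python
def greedy_select(sentence_tuples, target_tuples):
    """Pick source sentences whose tuples jointly cover the target tuples.

    sentence_tuples: list of sets, the tuples found in each source sentence
    target_tuples:   set of tuples predicted for the summary
    Returns the indices of the selected sentences, in selection order.
    """
    uncovered = set(target_tuples)
    selected = []
    while uncovered:
        # Choose the sentence that covers the most still-uncovered tuples.
        best_idx, best_gain = None, 0
        for idx, tuples in enumerate(sentence_tuples):
            if idx in selected:
                continue
            gain = len(uncovered & tuples)
            if gain > best_gain:
                best_idx, best_gain = idx, gain
        if best_idx is None:  # no remaining sentence adds coverage
            break
        selected.append(best_idx)
        uncovered -= sentence_tuples[best_idx]
    return selected


if __name__ == "__main__":
    # Hypothetical predicate-argument tuples, for illustration only.
    sents = [
        {("announce", "company", "merger")},
        {("approve", "board", "deal"), ("announce", "company", "merger")},
        {("rise", "shares", "5%")},
    ]
    summary_tuples = {("announce", "company", "merger"), ("rise", "shares", "5%")}
    print(greedy_select(sents, summary_tuples))  # prints [0, 2]
```

Because selection stops once the predicted tuples are covered, the number of extracted sentences is determined by the prediction rather than fixed in advance, which is the property the abstract highlights.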
- Anthology ID: 2024.inlg-main.10
- Volume: Proceedings of the 17th International Natural Language Generation Conference
- Month: September
- Year: 2024
- Address: Tokyo, Japan
- Editors: Saad Mahamood, Nguyen Le Minh, Daphne Ippolito
- Venue: INLG
- SIG: SIGGEN
- Publisher: Association for Computational Linguistics
- Pages: 121–133
- URL: https://aclanthology.org/2024.inlg-main.10
- Cite (ACL): Yubin Ge, Sullam Jeoung, and Jana Diesner. 2024. Extractive Summarization via Fine-grained Semantic Tuple Extraction. In Proceedings of the 17th International Natural Language Generation Conference, pages 121–133, Tokyo, Japan. Association for Computational Linguistics.
- Cite (Informal): Extractive Summarization via Fine-grained Semantic Tuple Extraction (Ge et al., INLG 2024)
- PDF: https://preview.aclanthology.org/add_acl24_videos/2024.inlg-main.10.pdf