Abstract
Factual accuracy is an important property of neural abstractive summarization models, especially in fact-critical domains such as the clinical literature. In this work, we introduce a guided continued pre-training stage for encoder-decoder models that improves their understanding of the factual attributes of documents, followed by supervised fine-tuning on summarization. Our approach extends the pre-training recipe of BART to incorporate three additional objectives based on PICO spans, which capture the population, intervention, comparison, and outcomes of a clinical study. Experiments on multi-document summarization in the clinical domain demonstrate that our approach is competitive with prior work, improving the quality and factuality of the summaries and achieving the best published results for factual accuracy on the MSLR task.
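The abstract names three PICO-based objectives but does not spell them out here, so the sketch below is only a hedged illustration of the general idea: a PICO-span infilling objective in the spirit of BART's text-infilling pre-training, where annotated PICO spans are masked and the model learns to reconstruct the original text. The `facebook/bart-base` checkpoint, the character offsets, and the `pico_infilling_batch` helper are illustrative assumptions, not the paper's actual recipe.

```python
# Minimal sketch (assumed, not the paper's exact objective): mask annotated
# PICO spans and train BART to reconstruct the original text, analogous to
# BART's text-infilling objective. Span offsets would come from an upstream
# PICO tagger; the ones below are hand-written for illustration.
from transformers import BartForConditionalGeneration, BartTokenizer

tokenizer = BartTokenizer.from_pretrained("facebook/bart-base")
model = BartForConditionalGeneration.from_pretrained("facebook/bart-base")

def pico_infilling_batch(text, pico_spans):
    """Replace each (start, end) PICO span with a single <mask> token and
    return (corrupted inputs, original text as labels)."""
    corrupted = text
    # Replace right-to-left so earlier character offsets stay valid.
    for start, end in sorted(pico_spans, reverse=True):
        corrupted = corrupted[:start] + tokenizer.mask_token + corrupted[end:]
    enc = tokenizer(corrupted, return_tensors="pt", truncation=True)
    labels = tokenizer(text, return_tensors="pt", truncation=True).input_ids
    return enc, labels

text = "We randomized 120 adults with type 2 diabetes to metformin or placebo."
spans = [(14, 45), (49, 58)]  # population span, intervention span

enc, labels = pico_infilling_batch(text, spans)
loss = model(**enc, labels=labels).loss  # cross-entropy reconstruction loss
loss.backward()  # an optimizer step would follow in a real training loop
```

In the paper's setup, such guided objectives extend (rather than replace) BART's original pre-training recipe, and the continued pre-trained model is then fine-tuned on the supervised summarization task.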
- Anthology ID: 2024.naacl-short.66
- Volume: Proceedings of the 2024 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (Volume 2: Short Papers)
- Month: June
- Year: 2024
- Address: Mexico City, Mexico
- Editors: Kevin Duh, Helena Gomez, Steven Bethard
- Venue: NAACL
- Publisher: Association for Computational Linguistics
- Pages: 755–761
- URL: https://aclanthology.org/2024.naacl-short.66
- Cite (ACL): Ahmed Elhady, Khaled Elsayed, Eneko Agirre, and Mikel Artetxe. 2024. Improving Factuality in Clinical Abstractive Multi-Document Summarization by Guided Continued Pre-training. In Proceedings of the 2024 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (Volume 2: Short Papers), pages 755–761, Mexico City, Mexico. Association for Computational Linguistics.
- Cite (Informal): Improving Factuality in Clinical Abstractive Multi-Document Summarization by Guided Continued Pre-training (Elhady et al., NAACL 2024)
- PDF: https://preview.aclanthology.org/fix-volume-bibkeys/2024.naacl-short.66.pdf