Abstract
Pre-trained language models have achieved remarkable results in abstractive summarization, and the copying mechanism has proved helpful for improving factuality, stability, and overall performance. This work proposes PROM, a new PhRase-level cOpying Mechanism that enhances attention on n-grams and can be applied to zero-shot summarization with pre-training. PROM adds an indicator layer to explicitly pick up tokens in n-grams that can be copied from the source, and computes an auxiliary loss for the copying prediction. Empirical studies show that PROM yields significant improvements when fine-tuned on benchmarks. In the zero-shot setting, PROM is used in self-supervised pre-training on raw corpora and provides new general baselines on a wide range of summarization datasets. Further analysis shows that PROM performs more reasonable copying and contributes to faithfulness. Our code is publicly available at https://github.com/xbmxb/PROM.
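To make the described mechanism concrete, the following is a minimal sketch, not the authors' implementation: it assumes encoder states from any seq2seq model (e.g., BART), binary `copy_labels` marking source tokens inside copyable n-grams, and an indicator layer realized as a linear projection with a sigmoid, trained with a BCE auxiliary loss; the names `CopyIndicator` and `aux_weight` are hypothetical.

```python
# Hypothetical sketch of a PROM-style copy indicator head (not the paper's code).
import torch
import torch.nn as nn

class CopyIndicator(nn.Module):
    """Predicts, per source token, whether it belongs to a copyable n-gram."""

    def __init__(self, hidden_size: int):
        super().__init__()
        self.proj = nn.Linear(hidden_size, 1)  # assumed indicator layer
        self.loss_fn = nn.BCEWithLogitsLoss(reduction="none")

    def forward(self, encoder_states, copy_labels, attention_mask):
        # encoder_states: (batch, src_len, hidden)
        # copy_labels, attention_mask: (batch, src_len)
        logits = self.proj(encoder_states).squeeze(-1)         # (batch, src_len)
        per_token = self.loss_fn(logits, copy_labels.float())  # auxiliary copy loss
        mask = attention_mask.float()
        aux_loss = (per_token * mask).sum() / mask.sum()       # ignore padding
        copy_probs = torch.sigmoid(logits)  # could bias cross-attention toward copyable spans
        return copy_probs, aux_loss

# Usage sketch: the auxiliary loss is added to the generation loss,
# weighted by a hypothetical hyperparameter `aux_weight`:
# total_loss = gen_loss + aux_weight * aux_loss
```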
- Anthology ID: 2024.lrec-main.1148
- Volume: Proceedings of the 2024 Joint International Conference on Computational Linguistics, Language Resources and Evaluation (LREC-COLING 2024)
- Month: May
- Year: 2024
- Address: Torino, Italia
- Editors: Nicoletta Calzolari, Min-Yen Kan, Veronique Hoste, Alessandro Lenci, Sakriani Sakti, Nianwen Xue
- Venues: LREC | COLING
- Publisher: ELRA and ICCL
- Pages: 13103–13119
- URL: https://preview.aclanthology.org/remove-affiliations/2024.lrec-main.1148/
- Cite (ACL): Xinbei Ma, Yeyun Gong, Pengcheng He, Hai Zhao, and Nan Duan. 2024. PROM: A Phrase-level Copying Mechanism with Pre-training for Abstractive Summarization. In Proceedings of the 2024 Joint International Conference on Computational Linguistics, Language Resources and Evaluation (LREC-COLING 2024), pages 13103–13119, Torino, Italia. ELRA and ICCL.
- Cite (Informal): PROM: A Phrase-level Copying Mechanism with Pre-training for Abstractive Summarization (Ma et al., LREC-COLING 2024)
- PDF: https://preview.aclanthology.org/remove-affiliations/2024.lrec-main.1148.pdf