Improving Consistency for Text Summarization with Energy Functions

Qi Zeng, Qingyu Yin, Zheng Li, Yifan Gao, Sreyashi Nag, Zhengyang Wang, Bing Yin, Heng Ji, Chao Zhang


Abstract
Current abstractive summarization models often generate inconsistent content, i.e., text that is not directly inferable from the source document, contradicts world knowledge, or is self-contradictory. These phenomena motivate a new consistency taxonomy that we define along three dimensions: faithfulness, factuality, and self-supportiveness. However, most recent work on reducing inconsistency in document summarization focuses only on detecting and correcting faithfulness errors while ignoring the other inconsistency phenomena, which limits the model's scalability. To improve general consistency, we introduce EnergySum, which applies a residual energy-based model with energy scorers designed to reflect each type of consistency. These energy scores are used to re-rank candidates during the sampling process. Experiments on the XSUM and CNN/DM datasets show that EnergySum mitigates the trade-off between accuracy and consistency.
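To make the re-ranking idea concrete, below is a minimal Python sketch of residual energy-based candidate re-ranking as the abstract describes it: each candidate summary is scored by the base model's log-probability minus a weighted sum of energy terms, and candidates are ranked by that residual score. All names, the stub faithfulness scorer, and the weighting scheme are illustrative assumptions, not the paper's actual implementation.

```python
# Hypothetical sketch of residual-EBM candidate re-ranking.
# score(y) = log p_LM(y | x) - sum_i w_i * E_i(x, y); lower energy
# means more consistent, so higher score means a better candidate.
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Candidate:
    text: str
    log_prob: float  # log p_LM(y | x) from the base summarizer

# An energy scorer maps (source, candidate) -> scalar energy.
EnergyFn = Callable[[str, str], float]

def rerank(source: str,
           candidates: List[Candidate],
           energy_fns: List[EnergyFn],
           weights: List[float]) -> List[Candidate]:
    """Rank candidates by the unnormalized residual-EBM log-score."""
    def score(c: Candidate) -> float:
        energy = sum(w * f(source, c.text)
                     for w, f in zip(weights, energy_fns))
        return c.log_prob - energy
    return sorted(candidates, key=score, reverse=True)

if __name__ == "__main__":
    # Stub standing in for a learned faithfulness energy scorer:
    # penalize candidate tokens that never appear in the source.
    def faithfulness_energy(src: str, cand: str) -> float:
        src_tokens = set(src.lower().split())
        cand_tokens = cand.lower().split()
        novel = [t for t in cand_tokens if t not in src_tokens]
        return len(novel) / max(len(cand_tokens), 1)

    cands = [
        Candidate("the report says profits rose", log_prob=-4.2),
        Candidate("profits fell sharply last year", log_prob=-3.9),
    ]
    ranked = rerank("the report says profits rose in 2022",
                    cands, [faithfulness_energy], weights=[1.0])
    print(ranked[0].text)  # -> "the report says profits rose"
```

In this sketch, the factuality and self-supportiveness scorers named in the taxonomy would be additional `EnergyFn` entries with their own weights; the paper presumably learns these scorers rather than using surface heuristics like the stub above.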
Anthology ID:
2023.findings-emnlp.798
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2023
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
11925–11931
URL:
https://aclanthology.org/2023.findings-emnlp.798
DOI:
10.18653/v1/2023.findings-emnlp.798
Cite (ACL):
Qi Zeng, Qingyu Yin, Zheng Li, Yifan Gao, Sreyashi Nag, Zhengyang Wang, Bing Yin, Heng Ji, and Chao Zhang. 2023. Improving Consistency for Text Summarization with Energy Functions. In Findings of the Association for Computational Linguistics: EMNLP 2023, pages 11925–11931, Singapore. Association for Computational Linguistics.
Cite (Informal):
Improving Consistency for Text Summarization with Energy Functions (Zeng et al., Findings 2023)
PDF:
https://preview.aclanthology.org/naacl-24-ws-corrections/2023.findings-emnlp.798.pdf