Mitigating Intrinsic Named Entity-Related Hallucinations of Abstractive Text Summarization

Jianbin Shen, Junyu Xuan, Christy Liang
Abstract
Abstractive text summarization (ATS) is both important and challenging. Recent studies have shown that ATS still suffers from various forms of hallucination. Our study further indicates that a significant portion of these hallucinations is named entity-related, appearing in forms such as mistaken entities and erroneous entity references. The underlying causes implicit in the data are complex: different data samples pose varying learning conditions. Despite recent research efforts dedicated to named entity-related hallucinations, existing solutions have not adequately addressed these varying learning conditions. This paper aims to bridge that gap in pursuit of reducing intrinsic named entity-related hallucinations. To do so, we propose an adaptive margin ranking loss that facilitates two entity-alignment learning methods. Our experimental results show that our methods improve on the baseline model in automatic evaluation scores. Human evaluation also indicates that our methods jointly and considerably reduce intrinsic named entity-related hallucinations compared to the baseline model.
Anthology ID:
2023.findings-emnlp.1059
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2023
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
15807–15824
URL:
https://aclanthology.org/2023.findings-emnlp.1059
DOI:
10.18653/v1/2023.findings-emnlp.1059
Cite (ACL):
Jianbin Shen, Junyu Xuan, and Christy Liang. 2023. Mitigating Intrinsic Named Entity-Related Hallucinations of Abstractive Text Summarization. In Findings of the Association for Computational Linguistics: EMNLP 2023, pages 15807–15824, Singapore. Association for Computational Linguistics.
Cite (Informal):
Mitigating Intrinsic Named Entity-Related Hallucinations of Abstractive Text Summarization (Shen et al., Findings 2023)
PDF:
https://preview.aclanthology.org/dois-2013-emnlp/2023.findings-emnlp.1059.pdf