Unsupervised Token-level Hallucination Detection from Summary Generation By-products

Andreas Marfurt, James Henderson


Abstract
Hallucinations in abstractive summarization are model generations that are unfaithful to the source document. Current methods for detecting hallucinations operate mostly on noun phrases and named entities, and restrict themselves to the XSum dataset, which is known to have hallucinations in 3 out of 4 training examples (Maynez et al., 2020). We instead consider the CNN/DailyMail dataset, where the summarization model has not seen an abnormally large number of hallucinations during training. We automatically detect candidate hallucinations at the token level, irrespective of their part of speech. Our detection comes essentially for free, as we only use information the model already produces during generation of the summary. This enables practitioners to jointly generate a summary and identify possible hallucinations, with minimal overhead. We repurpose an existing factuality dataset and create our own token-level annotations. The evaluation on these two datasets shows that our model achieves better precision-recall tradeoffs than its competitors, which additionally require a separate model forward pass.
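The abstract does not specify which generation by-products the paper uses, but per-token probabilities are one signal a seq2seq model already computes at every decoding step. The following is a minimal, hypothetical sketch of the general idea, not the paper's method: it flags low-confidence tokens as hallucination candidates while generating a CNN/DailyMail-style summary. The model choice, the probability signal, and the threshold are all illustrative assumptions.

    import torch
    from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

    # Illustrative choice of a CNN/DailyMail summarizer.
    model_name = "facebook/bart-large-cnn"
    tokenizer = AutoTokenizer.from_pretrained(model_name)
    model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

    document = "(source article text here)"  # placeholder input
    inputs = tokenizer(document, return_tensors="pt", truncation=True)

    out = model.generate(
        **inputs,
        num_beams=1,                  # greedy, so scores align with emitted tokens
        max_new_tokens=128,
        return_dict_in_generate=True,
        output_scores=True,           # keep the per-step logits generate() computes anyway
    )

    # Log-probabilities the model assigned to the tokens it actually emitted.
    logprobs = model.compute_transition_scores(
        out.sequences, out.scores, normalize_logits=True
    )[0]

    THRESHOLD = 0.3  # illustrative; would be tuned on annotated data
    summary_tokens = out.sequences[0, 1:]  # drop the decoder start token
    for tok_id, logp in zip(summary_tokens, logprobs):
        p = torch.exp(logp).item()
        flag = "  <-- candidate hallucination" if p < THRESHOLD else ""
        print(f"{tokenizer.decode(tok_id)!r:>15}  p={p:.2f}{flag}")

The paper's actual by-product signal may well differ; the sketch only illustrates the "essentially for free" property the abstract claims: detection reuses quantities produced during decoding, with no additional model forward pass.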
Anthology ID:
2022.gem-1.21
Volume:
Proceedings of the 2nd Workshop on Natural Language Generation, Evaluation, and Metrics (GEM)
Month:
December
Year:
2022
Address:
Abu Dhabi, United Arab Emirates (Hybrid)
Venue:
GEM
SIG:
SIGGEN
Publisher:
Association for Computational Linguistics
Pages:
248–261
URL:
https://aclanthology.org/2022.gem-1.21
Cite (ACL):
Andreas Marfurt and James Henderson. 2022. Unsupervised Token-level Hallucination Detection from Summary Generation By-products. In Proceedings of the 2nd Workshop on Natural Language Generation, Evaluation, and Metrics (GEM), pages 248–261, Abu Dhabi, United Arab Emirates (Hybrid). Association for Computational Linguistics.
Cite (Informal):
Unsupervised Token-level Hallucination Detection from Summary Generation By-products (Marfurt & Henderson, GEM 2022)
PDF:
https://preview.aclanthology.org/ingestion-script-update/2022.gem-1.21.pdf