Read before Generate! Faithful Long Form Question Answering with Machine Reading

Dan Su, Xiaoguang Li, Jindi Zhang, Lifeng Shang, Xin Jiang, Qun Liu, Pascale Fung


Abstract
Long-form question answering (LFQA) aims to generate a paragraph-length answer for a given question. While current work on LFQA using large pre-trained models for generation is effective at producing fluent and somewhat relevant content, a primary challenge lies in how to generate a faithful answer with less hallucinated content. We propose a new end-to-end framework that jointly models answer generation and machine reading. The key idea is to augment the generation model with fine-grained, answer-related salient information, which can be viewed as an emphasis on faithful facts. State-of-the-art results on two LFQA datasets, ELI5 and MS MARCO, demonstrate the effectiveness of our method in comparison with strong baselines on automatic and human evaluation metrics. A detailed analysis further shows the competency of our method in generating fluent, relevant, and more faithful answers.
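The core idea — run a machine reading module over the retrieved evidence to surface answer-related salient spans, then feed those spans to the generator as an explicit emphasis on faithful facts — can be illustrated with a minimal sketch. The code below is an assumption-laden illustration, not the authors' implementation: it uses off-the-shelf HuggingFace pipelines (deepset/roberta-base-squad2 as the reader, facebook/bart-large as the generator) and simple reader-confidence ranking as a stand-in for the paper's fine-grained saliency model.

```python
# Minimal "read before generate" sketch. Model choices and the
# saliency-selection heuristic are illustrative assumptions only.
from transformers import pipeline

reader = pipeline("question-answering", model="deepset/roberta-base-squad2")
generator = pipeline("text2text-generation", model="facebook/bart-large")

def read_before_generate(question: str, passages: list[str], top_k: int = 2) -> str:
    # Read: score each retrieved passage with the extractive reader and
    # keep its most confident spans -- a stand-in for the paper's
    # fine-grained salient-information extraction.
    scored = []
    for passage in passages:
        result = reader(question=question, context=passage)
        scored.append((result["score"], result["answer"], passage))
    scored.sort(key=lambda x: x[0], reverse=True)
    salient = [answer for _, answer, _ in scored[:top_k]]

    # Generate: prepend the salient spans to the question and evidence so
    # the generator is conditioned on the extracted facts, not just the
    # raw passages.
    context = " ".join(passage for _, _, passage in scored)
    prompt = f"question: {question} salient: {'; '.join(salient)} context: {context}"
    return generator(prompt, max_length=256)[0]["generated_text"]
```

In practice the generator would be fine-tuned on LFQA data so that it learns to ground its output in the highlighted spans; the prompt format above is only one plausible way to inject the reader's output.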
Anthology ID:
2022.findings-acl.61
Volume:
Findings of the Association for Computational Linguistics: ACL 2022
Month:
May
Year:
2022
Address:
Dublin, Ireland
Editors:
Smaranda Muresan, Preslav Nakov, Aline Villavicencio
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
744–756
URL:
https://preview.aclanthology.org/build-pipeline-with-new-library/2022.findings-acl.61/
DOI:
10.18653/v1/2022.findings-acl.61
Cite (ACL):
Dan Su, Xiaoguang Li, Jindi Zhang, Lifeng Shang, Xin Jiang, Qun Liu, and Pascale Fung. 2022. Read before Generate! Faithful Long Form Question Answering with Machine Reading. In Findings of the Association for Computational Linguistics: ACL 2022, pages 744–756, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
Read before Generate! Faithful Long Form Question Answering with Machine Reading (Su et al., Findings 2022)
PDF:
https://preview.aclanthology.org/build-pipeline-with-new-library/2022.findings-acl.61.pdf
Video:
https://preview.aclanthology.org/build-pipeline-with-new-library/2022.findings-acl.61.mp4
Data
ELI5, HotpotQA, KILT, MS MARCO, Natural Questions