Evidence Retrieval is almost All You Need for Fact Verification
Liwen Zheng, Chaozhuo Li, Xi Zhang, Yu-Ming Shang, Feiran Huang, Haoran Jia
Abstract
Current fact verification methods generally follow a two-stage training paradigm: evidence retrieval and claim verification. While existing work focuses on developing sophisticated claim verification modules, the fundamental importance of evidence retrieval is largely ignored. Existing approaches usually adopt a heuristic semantic-similarity-based retrieval strategy, resulting in task-irrelevant evidence and undesirable performance. In this paper, we concentrate on evidence retrieval and propose a Retrieval-Augmented Verification framework, RAV, consisting of two major modules: hybrid evidence retrieval and joint fact verification. The hybrid evidence retrieval module incorporates an efficient retriever for preliminary pruning of candidate evidence, followed by a ranker that produces a more precise ordering. Under this end-to-end training paradigm, gradients from claim verification can be back-propagated to enhance evidence selection. Experimental results on the FEVER dataset demonstrate the superiority of RAV.
- Anthology ID: 2024.findings-acl.551
- Volume: Findings of the Association for Computational Linguistics: ACL 2024
- Month: August
- Year: 2024
- Address: Bangkok, Thailand
- Editors: Lun-Wei Ku, Andre Martins, Vivek Srikumar
- Venue: Findings
- Publisher: Association for Computational Linguistics
- Pages: 9274–9281
- URL: https://aclanthology.org/2024.findings-acl.551
- DOI: 10.18653/v1/2024.findings-acl.551
- Cite (ACL): Liwen Zheng, Chaozhuo Li, Xi Zhang, Yu-Ming Shang, Feiran Huang, and Haoran Jia. 2024. Evidence Retrieval is almost All You Need for Fact Verification. In Findings of the Association for Computational Linguistics: ACL 2024, pages 9274–9281, Bangkok, Thailand. Association for Computational Linguistics.
- Cite (Informal): Evidence Retrieval is almost All You Need for Fact Verification (Zheng et al., Findings 2024)
- PDF: https://preview.aclanthology.org/autopr/2024.findings-acl.551.pdf
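The retrieve-then-rank pipeline described in the abstract can be illustrated with a toy sketch. This is not the authors' implementation: RAV trains neural retriever and ranker modules jointly with the verifier, whereas the overlap-based scores below merely stand in for those learned components to show the two-stage pruning-then-reranking flow.

```python
# Toy sketch of a two-stage evidence pipeline: a cheap retriever prunes the
# candidate pool, then a (pretend) finer-grained ranker reorders survivors.
# Hypothetical scoring functions; RAV's actual modules are trained end-to-end.

def retrieve(claim, corpus, k=3):
    """First stage: coarse pruning by token-overlap score, keep top-k."""
    claim_tokens = set(claim.lower().split())
    scored = [(len(claim_tokens & set(doc.lower().split())), doc) for doc in corpus]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [doc for _, doc in scored[:k]]

def rank(claim, candidates):
    """Second stage: rerank the pruned candidates with a finer score
    (here a length-normalized overlap, standing in for a cross-encoder)."""
    claim_tokens = claim.lower().split()
    def score(doc):
        doc_tokens = doc.lower().split()
        return sum(1 for t in claim_tokens if t in doc_tokens) / (len(doc_tokens) ** 0.5)
    return sorted(candidates, key=score, reverse=True)

corpus = [
    "Paris is the capital of France.",
    "The Eiffel Tower is in Paris.",
    "Berlin is the capital of Germany.",
    "France is in Europe.",
]
claim = "Paris is the capital of France"
evidence = rank(claim, retrieve(claim, corpus, k=2))
print(evidence[0])  # the most claim-relevant sentence survives both stages
```

In RAV the second-stage ranker is differentiable, so the verifier's loss can flow back through it; the hard top-k cut above is only a stand-in for that learned selection.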