ClaimCheck: Automatic Fact-Checking of Textual Claims using Web Evidence

Akshith Reddy Putta, Jacob Devasier, Chengkai Li


Abstract
We introduce ClaimCheck, an efficient fact-checking system that verifies textual claims using smaller, open-source large language models. ClaimCheck integrates two fact-checking strategies: claim-matching and novel claim processing. Claim-matching uses related fact-checks from trusted organizations to fact-check a claim. Novel claim processing breaks down fact-checking into manageable subtasks: generating targeted questions, retrieving Web evidence, extracting answers, and synthesizing verdicts. Evaluation on the AVeriTeC benchmark demonstrates 62.6% verdict prediction accuracy, with claim-matching providing a 2.8% improvement. ClaimCheck approaches the performance of state-of-the-art systems while requiring significantly fewer computational resources, demonstrating the effectiveness of using small language models for fact-checking tasks. Furthermore, our code is publicly available to help make automated fact-checking more accessible.
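
The abstract describes a two-track pipeline: reuse an existing fact-check when one matches, otherwise decompose verification into question generation, Web retrieval, answer extraction, and verdict synthesis. The Python sketch below is only an illustration of that flow under assumed interfaces; every function name (e.g., match_existing_factcheck, generate_questions, synthesize_verdict) and stub body is hypothetical and is not the authors' released code.

    # Hypothetical sketch of the two fact-checking strategies described in the
    # abstract; all internals are placeholder stubs, not the authors' implementation.
    from dataclasses import dataclass, field

    @dataclass
    class Verdict:
        label: str                      # e.g., "Supported", "Refuted", "Not Enough Evidence"
        evidence: list = field(default_factory=list)

    def match_existing_factcheck(claim: str):
        """Claim-matching: look up related fact-checks from trusted organizations."""
        return None                     # stub: a real system would query a fact-check index

    def generate_questions(claim: str) -> list:
        """Generate targeted questions about the claim (stub)."""
        return [f"Is the following statement accurate: {claim}?"]

    def retrieve_web_evidence(question: str) -> list:
        """Retrieve Web evidence for a question (stub for a search step)."""
        return []

    def extract_answer(question: str, docs: list) -> str:
        """Extract an answer to the question from retrieved documents (stub)."""
        return "no evidence found"

    def synthesize_verdict(claim: str, questions: list, answers: list) -> Verdict:
        """Combine question-answer pairs into a final verdict (stub for the LLM judgment)."""
        return Verdict(label="Not Enough Evidence", evidence=answers)

    def verify_novel_claim(claim: str) -> Verdict:
        """Novel claim processing: questions -> Web retrieval -> answers -> verdict."""
        questions = generate_questions(claim)
        evidence = [retrieve_web_evidence(q) for q in questions]
        answers = [extract_answer(q, docs) for q, docs in zip(questions, evidence)]
        return synthesize_verdict(claim, questions, answers)

    def claimcheck(claim: str) -> Verdict:
        matched = match_existing_factcheck(claim)
        if matched is not None:
            return matched              # reuse the matched fact-check's verdict
        return verify_novel_claim(claim)
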
Anthology ID:
2025.knowledgenlp-1.26
Volume:
Proceedings of the 4th International Workshop on Knowledge-Augmented Methods for Natural Language Processing
Month:
May
Year:
2025
Address:
Albuquerque, New Mexico, USA
Editors:
Weijia Shi, Wenhao Yu, Akari Asai, Meng Jiang, Greg Durrett, Hannaneh Hajishirzi, Luke Zettlemoyer
Venues:
KnowledgeNLP | WS
Publisher:
Association for Computational Linguistics
Pages:
303–316
URL:
https://preview.aclanthology.org/landing_page/2025.knowledgenlp-1.26/
Cite (ACL):
Akshith Reddy Putta, Jacob Devasier, and Chengkai Li. 2025. ClaimCheck: Automatic Fact-Checking of Textual Claims using Web Evidence. In Proceedings of the 4th International Workshop on Knowledge-Augmented Methods for Natural Language Processing, pages 303–316, Albuquerque, New Mexico, USA. Association for Computational Linguistics.
Cite (Informal):
ClaimCheck: Automatic Fact-Checking of Textual Claims using Web Evidence (Putta et al., KnowledgeNLP 2025)
PDF:
https://preview.aclanthology.org/landing_page/2025.knowledgenlp-1.26.pdf