MultiReflect: Multimodal Self-Reflective RAG-based Automated Fact-Checking

Uku Kangur, Krish Agrawal, Yashashvi Singh, Ahmed Sabir, Rajesh Sharma


Abstract
In this work, we introduce MultiReflect, a novel multimodal self-reflective Retrieval-Augmented Generation (RAG)-based automated fact-checking pipeline. MultiReflect is designed to address the challenges of rapidly outdated information, the limits of human query capabilities, and expert knowledge barriers in fact-checking. The pipeline leverages recent advances in Large Language Models (LLMs) and RAG to enhance fact verification across text and images. Specifically, by integrating multimodal data processing with RAG-based evidence reflection, our system improves fact-checking accuracy through internet-sourced verification. We evaluate our approach on the VERITE benchmark with several multimodal LLMs, outperforming baselines on binary classification.
Anthology ID:
2025.magmar-1.1
Volume:
Proceedings of the 1st Workshop on Multimodal Augmented Generation via Multimodal Retrieval (MAGMaR 2025)
Month:
August
Year:
2025
Address:
Vienna, Austria
Editors:
Reno Kriz, Kenton Murray
Venues:
MAGMaR | WS
Publisher:
Association for Computational Linguistics
Pages:
1–17
URL:
https://preview.aclanthology.org/landing_page/2025.magmar-1.1/
Cite (ACL):
Uku Kangur, Krish Agrawal, Yashashvi Singh, Ahmed Sabir, and Rajesh Sharma. 2025. MultiReflect: Multimodal Self-Reflective RAG-based Automated Fact-Checking. In Proceedings of the 1st Workshop on Multimodal Augmented Generation via Multimodal Retrieval (MAGMaR 2025), pages 1–17, Vienna, Austria. Association for Computational Linguistics.
Cite (Informal):
MultiReflect: Multimodal Self-Reflective RAG-based Automated Fact-Checking (Kangur et al., MAGMaR 2025)
PDF:
https://preview.aclanthology.org/landing_page/2025.magmar-1.1.pdf