Don’t Forget the Base Retriever! A Low-Resource Graph-based Retriever for Multi-hop Question Answering

Andre Melo, Enting Chen, Pavlos Vougiouklis, Chenxin Diao, Shriram Piramanayagam, Ruofei Lai, Jeff Z. Pan


Abstract
Traditional Retrieval-augmented Generation systems struggle with complex multi-hop questions, which often require reasoning over multiple passages. While GraphRAG approaches address these challenges, most of them rely on expensive LLM calls. In this paper, we propose Griever, a lightweight, low-resource, multi-step graph-based retriever for multi-hop QA. Unlike prior work, Griever does not rely on LLMs and can perform multi-step retrieval in a few hundred milliseconds. It efficiently indexes passages alongside an associated knowledge graph and employs a hybrid retriever combined with aggressive filtering to reduce retrieval latency. Experiments on multi-hop QA datasets demonstrate that Griever outperforms conventional retrievers and shows strong potential as a base retriever within multi-step agentic frameworks.
Anthology ID:
2025.emnlp-industry.174
Volume:
Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing: Industry Track
Month:
November
Year:
2025
Address:
Suzhou (China)
Editors:
Saloni Potdar, Lina Rojas-Barahona, Sebastien Montella
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
2564–2572
URL:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-industry.174/
Cite (ACL):
Andre Melo, Enting Chen, Pavlos Vougiouklis, Chenxin Diao, Shriram Piramanayagam, Ruofei Lai, and Jeff Z. Pan. 2025. Don’t Forget the Base Retriever! A Low-Resource Graph-based Retriever for Multi-hop Question Answering. In Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing: Industry Track, pages 2564–2572, Suzhou (China). Association for Computational Linguistics.
Cite (Informal):
Don’t Forget the Base Retriever! A Low-Resource Graph-based Retriever for Multi-hop Question Answering (Melo et al., EMNLP 2025)
PDF:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-industry.174.pdf