ASRank: Zero-Shot Re-Ranking with Answer Scent for Document Retrieval
Abdelrahman Abdallah, Jamshid Mozafari, Bhawna Piryani, Adam Jatowt
Abstract
Retrieval-Augmented Generation (RAG) models have drawn considerable attention in modern open-domain question answering. The effectiveness of RAG depends on the quality of the top retrieved documents. However, conventional retrieval methods sometimes fail to rank the most relevant documents at the top. In this paper, we introduce ASRank, a new re-ranking method that scores retrieved documents using a zero-shot answer scent: a pre-trained large language model computes the likelihood that answers derived from each document align with the answer scent. Our approach demonstrates marked improvements across several datasets, including NQ, TriviaQA, WebQA, ArchivalQA, HotpotQA, and EntityQuestions. Notably, ASRank increases Top-1 retrieval accuracy on NQ from 19.2% to 46.5% for MSS and from 22.1% to 47.3% for BM25. It also achieves strong retrieval performance compared to state-of-the-art methods (47.3 Top-1 by ASRank vs. 35.4 by UPR, both with BM25).
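The abstract sketches the core idea: an LLM first produces an "answer scent" for the question, and each retrieved document is then scored by how well an answer derived from it aligns with that scent. The snippet below is a minimal, illustrative sketch of this style of zero-shot, likelihood-based re-ranking with Hugging Face transformers; the model name (gpt2), prompt wording, and scoring rule are our placeholders, not the paper's actual configuration.

```python
# Illustrative sketch only: the exact answer-scent prompts and scoring
# formula used by ASRank are described in the paper; model, prompts, and
# the log-likelihood scoring rule below are assumptions for demonstration.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "gpt2"  # placeholder; the paper uses larger instruction-tuned LLMs
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)
model.eval()

def answer_scent(question: str, max_new_tokens: int = 16) -> str:
    """Generate a short hypothesized answer (the 'scent') for the question."""
    prompt = f"Question: {question}\nShort answer:"
    inputs = tokenizer(prompt, return_tensors="pt")
    with torch.no_grad():
        out = model.generate(**inputs, max_new_tokens=max_new_tokens,
                             do_sample=False,
                             pad_token_id=tokenizer.eos_token_id)
    return tokenizer.decode(out[0, inputs["input_ids"].shape[1]:],
                            skip_special_tokens=True).strip()

def scent_log_likelihood(question: str, passage: str, scent: str) -> float:
    """Score a passage by the log-likelihood of the scent tokens,
    conditioned on the passage and question (illustrative scoring rule)."""
    context = f"Passage: {passage}\nQuestion: {question}\nAnswer:"
    ctx_ids = tokenizer(context, return_tensors="pt")["input_ids"]
    scent_ids = tokenizer(" " + scent, return_tensors="pt")["input_ids"]
    input_ids = torch.cat([ctx_ids, scent_ids], dim=1)
    labels = input_ids.clone()
    labels[:, : ctx_ids.shape[1]] = -100  # only score the scent tokens
    with torch.no_grad():
        loss = model(input_ids, labels=labels).loss  # mean NLL over scent tokens
    return -loss.item()

def rerank(question: str, passages: list[str]) -> list[str]:
    """Re-order retrieved passages by their answer-scent alignment score."""
    scent = answer_scent(question)
    scored = [(scent_log_likelihood(question, p, scent), p) for p in passages]
    return [p for _, p in sorted(scored, key=lambda x: x[0], reverse=True)]
```

In use, a first-stage retriever such as BM25 or MSS would supply the candidate passages, and rerank(question, passages) reorders them so that answer-bearing passages rise toward Top-1.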
- Anthology ID: 2025.findings-naacl.161
- Volume: Findings of the Association for Computational Linguistics: NAACL 2025
- Month: April
- Year: 2025
- Address: Albuquerque, New Mexico
- Editors: Luis Chiruzzo, Alan Ritter, Lu Wang
- Venue: Findings
- Publisher: Association for Computational Linguistics
- Pages: 2950–2970
- URL: https://preview.aclanthology.org/Ingest-2025-COMPUTEL/2025.findings-naacl.161/
- Cite (ACL): Abdelrahman Abdallah, Jamshid Mozafari, Bhawna Piryani, and Adam Jatowt. 2025. ASRank: Zero-Shot Re-Ranking with Answer Scent for Document Retrieval. In Findings of the Association for Computational Linguistics: NAACL 2025, pages 2950–2970, Albuquerque, New Mexico. Association for Computational Linguistics.
- Cite (Informal): ASRank: Zero-Shot Re-Ranking with Answer Scent for Document Retrieval (Abdallah et al., Findings 2025)
- PDF: https://preview.aclanthology.org/Ingest-2025-COMPUTEL/2025.findings-naacl.161.pdf