REALM: Recursive Relevance Modeling for LLM-based Document Re-Ranking

Pinhuan Wang, Zhiqiu Xia, Chunhua Liao, Feiyi Wang, Hang Liu


Abstract
Large Language Models (LLMs) have shown strong capabilities in document re-ranking, a key component in modern Information Retrieval (IR) systems. However, existing LLM-based approaches face notable limitations, including ranking uncertainty, unstable top-k recovery, and high cost from token-intensive prompting. To address these limitations, we propose REALM, an uncertainty-aware re-ranking framework that models LLM-derived relevance as Gaussian distributions and refines them through recursive Bayesian updates. By explicitly capturing uncertainty and minimizing redundant queries, REALM achieves better rankings more efficiently. Experimental results demonstrate that REALM surpasses state-of-the-art re-rankers while significantly reducing token usage and latency, improving NDCG@10 by 0.7–11.9 points while reducing the number of LLM inferences by 23.4–84.4%, positioning it as a next-generation re-ranker for modern IR systems.
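
The abstract does not spell out the update rule, but the combination of Gaussian relevance beliefs and recursive Bayesian refinement suggests a conjugate (Kalman-style) Gaussian update. Below is a minimal sketch of that idea, assuming each document carries an independent N(mu, sigma^2) belief and that each LLM call yields a noisy scalar relevance observation; the names `llm_score`, `obs_var`, and the variance-based stopping rule are illustrative assumptions, not details taken from the paper.

```python
from dataclasses import dataclass

@dataclass
class Relevance:
    """Gaussian belief over a document's relevance: N(mu, var)."""
    mu: float = 0.0
    var: float = 1.0

def bayesian_update(belief: Relevance, obs: float, obs_var: float) -> Relevance:
    """Conjugate Gaussian update: fuse the prior N(mu, var) with a noisy
    LLM-derived observation obs ~ N(true_relevance, obs_var)."""
    k = belief.var / (belief.var + obs_var)   # Kalman-style gain
    mu = belief.mu + k * (obs - belief.mu)    # posterior mean
    var = (1.0 - k) * belief.var              # posterior variance shrinks
    return Relevance(mu, var)

def rerank(doc_ids, llm_score, obs_var=0.25, max_calls=50, stop_var=0.05):
    """Recursively query the LLM on the most uncertain document until all
    beliefs are confident enough or the call budget is exhausted."""
    beliefs = {d: Relevance() for d in doc_ids}
    calls = 0
    while calls < max_calls:
        # Pick the document whose belief is currently least certain.
        d = max(beliefs, key=lambda x: beliefs[x].var)
        if beliefs[d].var < stop_var:
            break  # every belief is sufficiently confident; stop querying
        beliefs[d] = bayesian_update(beliefs[d], llm_score(d), obs_var)
        calls += 1
    # Final ranking by posterior mean relevance.
    return sorted(doc_ids, key=lambda x: beliefs[x].mu, reverse=True)
```

Under this scheme, documents whose beliefs converge quickly stop consuming LLM calls, which is one plausible mechanism behind the abstract's reported 23.4–84.4% reduction in LLM inferences.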
Anthology ID:
2025.emnlp-main.1218
Volume:
Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
23875–23889
URL:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.1218/
Cite (ACL):
Pinhuan Wang, Zhiqiu Xia, Chunhua Liao, Feiyi Wang, and Hang Liu. 2025. REALM: Recursive Relevance Modeling for LLM-based Document Re-Ranking. In Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing, pages 23875–23889, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
REALM: Recursive Relevance Modeling for LLM-based Document Re-Ranking (Wang et al., EMNLP 2025)
PDF:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.1218.pdf
Checklist:
2025.emnlp-main.1218.checklist.pdf