LimRank: Less is More for Reasoning-Intensive Information Reranking

Tingyu Song, Yilun Zhao, Siyue Zhang, Chen Zhao, Arman Cohan


Abstract
Existing approaches typically rely on large-scale fine-tuning to adapt LLMs for information reranking tasks, which is computationally expensive. In this work, we demonstrate that modern LLMs can be effectively adapted using only minimal, high-quality supervision. To enable this, we design LIMRANK-SYNTHESIZER, a reusable and open-source pipeline for generating diverse, challenging, and realistic reranking examples. Using this synthetic data, we fine-tune our reranker model, LIMRANK. We evaluate LIMRANK on two challenging benchmarks: BRIGHT for reasoning-intensive retrieval and FollowIR for instruction-following retrieval. Our experiments demonstrate that LIMRANK achieves competitive performance while being trained on less than 5% of the data typically used in prior work. Further ablation studies demonstrate the effectiveness of LIMRANK-SYNTHESIZER and the strong generalization capabilities of LIMRANK across downstream tasks, including scientific literature search and retrieval-augmented generation for knowledge-intensive problem solving.
Anthology ID:
2025.emnlp-main.1041
Volume:
Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
20636–20650
URL:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.1041/
Cite (ACL):
Tingyu Song, Yilun Zhao, Siyue Zhang, Chen Zhao, and Arman Cohan. 2025. LimRank: Less is More for Reasoning-Intensive Information Reranking. In Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing, pages 20636–20650, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
LimRank: Less is More for Reasoning-Intensive Information Reranking (Song et al., EMNLP 2025)
PDF:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.1041.pdf
Checklist:
2025.emnlp-main.1041.checklist.pdf