RaDeR: Reasoning-aware Dense Retrieval Models

Debrup Das, Sam O’Nuallain, Razieh Rahimi


Abstract
We propose RaDeR, a set of reasoning-based dense retrieval models trained with data derived from mathematical problem solving using large language models (LLMs). Our method leverages retrieval-augmented reasoning trajectories of an LLM and self-reflective relevance evaluation, enabling the creation of both diverse and hard-negative samples for reasoning-intensive relevance. RaDeR retrievers, trained for mathematical reasoning, effectively generalize to diverse reasoning tasks in the BRIGHT and RAR-b benchmarks, consistently outperforming strong baselines in overall performance. Notably, RaDeR achieves significantly higher performance than baselines on the Math and Coding splits. In addition, RaDeR is the first dense retriever that outperforms BM25 when queries are Chain-of-Thought reasoning steps, underscoring the critical role of reasoning-based retrieval in augmenting reasoning language models. Furthermore, RaDeR achieves comparable or superior performance while using only 2.5% of the training data used by the concurrent work ReasonIR, highlighting the quality of our synthesized training data. Our code, data, and retrieval models are publicly available.
Anthology ID:
2025.emnlp-main.1011
Volume:
Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
19981–20008
URL:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.1011/
Cite (ACL):
Debrup Das, Sam O’Nuallain, and Razieh Rahimi. 2025. RaDeR: Reasoning-aware Dense Retrieval Models. In Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing, pages 19981–20008, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
RaDeR: Reasoning-aware Dense Retrieval Models (Das et al., EMNLP 2025)
PDF:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.1011.pdf
Checklist:
 2025.emnlp-main.1011.checklist.pdf