How Does Knowledge Selection Help Retrieval Augmented Generation?

Xiangci Li, Jessica Ouyang


Abstract
Retrieval-augmented generation (RAG) is a powerful method for enhancing natural language generation by integrating external knowledge into a model’s output. While prior work has demonstrated the importance of improving knowledge retrieval for boosting generation quality, the role of knowledge selection, a.k.a. reranking or filtering, remains less clear. This paper empirically analyzes how knowledge selection influences downstream generation performance in RAG systems. By simulating different retrieval and selection conditions through a controlled mixture of gold and distractor knowledge, we assess the impact of these factors on generation outcomes. Our findings indicate that the downstream generator model’s capability, as well as the complexity of the task and dataset, significantly influence the impact of knowledge selection on the overall RAG system performance. In typical scenarios, improving the knowledge recall score is key to enhancing generation outcomes, with the knowledge selector providing limited benefit when a strong generator model is used on clear, well-defined tasks. For weaker generator models or more ambiguous tasks and datasets, the knowledge F1 score becomes a critical factor, and the knowledge selector plays a more prominent role in improving overall performance.
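For readers who want a concrete picture of the simulation setup described in the abstract, the following minimal Python sketch shows one way to mix gold and distractor knowledge at a controlled recall level and to score a selected knowledge set by knowledge recall, precision, and F1. This is not the authors' code; the passage representation, function names, and parameters are illustrative assumptions.

import random

def simulate_retrieval(gold_passages, distractor_passages, recall, num_retrieved):
    # Build a simulated retrieval result containing a controlled fraction of gold knowledge.
    # `recall` is the fraction of gold passages included; remaining slots are filled
    # with distractors, so precision and F1 vary with `num_retrieved`.
    num_gold = round(recall * len(gold_passages))
    retrieved = random.sample(gold_passages, num_gold)
    num_distractors = min(len(distractor_passages), max(0, num_retrieved - num_gold))
    retrieved += random.sample(distractor_passages, num_distractors)
    random.shuffle(retrieved)
    return retrieved

def knowledge_scores(selected, gold):
    # Knowledge recall, precision, and F1 of a selected passage set against the gold set.
    selected, gold = set(selected), set(gold)
    hits = len(selected & gold)
    recall = hits / len(gold) if gold else 0.0
    precision = hits / len(selected) if selected else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return recall, precision, f1

Under these assumptions, sweeping `recall` while holding `num_retrieved` fixed simulates retrievers of varying quality, and applying a knowledge selector before generation changes the precision and F1 of the knowledge passed to the downstream generator.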
Anthology ID:
2025.findings-emnlp.218
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2025
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
4104–4121
URL:
https://preview.aclanthology.org/author-page-yu-wang-polytechnic/2025.findings-emnlp.218/
DOI:
10.18653/v1/2025.findings-emnlp.218
Cite (ACL):
Xiangci Li and Jessica Ouyang. 2025. How Does Knowledge Selection Help Retrieval Augmented Generation?. In Findings of the Association for Computational Linguistics: EMNLP 2025, pages 4104–4121, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
How Does Knowledge Selection Help Retrieval Augmented Generation? (Li & Ouyang, Findings 2025)
PDF:
https://preview.aclanthology.org/author-page-yu-wang-polytechnic/2025.findings-emnlp.218.pdf
Checklist:
2025.findings-emnlp.218.checklist.pdf