Semantic Contribution-Aware Adaptive Retrieval for Black-Box Models
Qinhong Lin | Zhongliang Yang | Yuang Cai | Dingfu Yu | Xuan Xu | Yu Li | Linna Zhou
Findings of the Association for Computational Linguistics: EMNLP 2025
Retrieval-Augmented Generation (RAG) plays a critical role in mitigating hallucinations and improving factual accuracy for Large Language Models (LLMs). While dynamic retrieval techniques aim to determine retrieval timing and content based on a model's intrinsic needs, existing approaches struggle to generalize effectively in black-box model scenarios. To address this limitation, we propose the Semantic Contribution-Aware Adaptive Retrieval (SCAAR) framework. SCAAR iteratively leverages the semantic importance of words in upcoming sentences to dynamically adjust retrieval thresholds and filter information, retaining the top-𝛼% most semantically significant words to construct retrieval queries. We comprehensively evaluate SCAAR against baseline methods on four long-form, knowledge-intensive generation datasets using four models. Our method achieves the highest score on every dataset with GPT-4o. Extensive experiments also analyze the impact of the framework's hyperparameters. Our results demonstrate SCAAR's superior or competitive performance, showcasing its ability to effectively detect a model's retrieval needs and to construct efficient retrieval queries for knowledge relevant to problem-solving in black-box scenarios. Our code is available at https://github.com/linqinhong/SAC.
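The top-𝛼% filtering step described above can be sketched as follows. This is an illustrative sketch only: SCAAR's actual semantic-contribution scores are computed by the framework itself, so here the `scores` dictionary is a hypothetical stand-in supplied by the caller, and `build_retrieval_query` is an assumed helper name, not the paper's API.

```python
def build_retrieval_query(sentence: str, scores: dict, alpha: float = 50.0) -> str:
    """Keep the top-alpha% highest-scoring words of `sentence`,
    preserving their original order, to form a retrieval query."""
    words = sentence.split()
    # Number of words to retain (at least one).
    k = max(1, round(len(words) * alpha / 100))
    # Rank words by their (externally supplied) semantic-importance score.
    top = sorted(words, key=lambda w: scores.get(w, 0.0), reverse=True)[:k]
    kept = set(top)
    return " ".join(w for w in words if w in kept)

# Hypothetical importance scores for demonstration.
scores = {"SCAAR": 0.95, "retrieval": 0.9, "thresholds": 0.8,
          "adjusts": 0.4, "dynamically": 0.3, "to": 0.0, "filter": 0.0}
query = build_retrieval_query(
    "SCAAR dynamically adjusts retrieval thresholds to filter",
    scores, alpha=50)
# query == "SCAAR adjusts retrieval thresholds"
```

In practice the scores would come from a semantic-contribution measure rather than a hand-written table; the sketch only shows how a fixed 𝛼 turns per-word scores into a compact query.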