Boosting Long-Context Information Seeking via Query-Guided Activation Refilling

Hongjin Qian, Zheng Liu, Peitian Zhang, Zhicheng Dou, Defu Lian


Abstract
Processing long contexts poses a significant challenge for large language models (LLMs) due to their inherent context window limitations and the computational burden of extensive key-value (KV) activations, which severely impact efficiency. For information-seeking tasks, full context perception is often unnecessary, as a query's information needs can dynamically range from localized details to a global perspective, depending on its complexity. However, existing methods struggle to adapt effectively to these dynamic information needs. In this paper, we propose a method for processing long-context information-seeking tasks via query-guided ACtivation REfilling (ACRE). ACRE constructs a Bi-layer KV Cache for long contexts, where the layer-1 (L1) cache compactly captures global information, and the layer-2 (L2) cache provides detailed, localized information. ACRE establishes a proxying relationship between the two caches, allowing the input query to attend to the L1 cache and dynamically refill it with relevant entries from the L2 cache. This mechanism integrates global understanding with query-specific local details, thereby enhancing answer decoding. Experiments on a variety of long-context information-seeking datasets demonstrate ACRE's effectiveness, achieving significant improvements in both performance and efficiency.
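The refilling mechanism described above can be illustrated with a toy sketch: each compact L1 entry proxies a span of fine-grained L2 entries, the query scores the L1 cache, and the top-scoring entries are swapped for their L2 details. This is a minimal illustration under assumed simplifications (mean-pooled proxies standing in for ACRE's learned compact cache, dot-product scoring in place of full attention); all names and shapes here are hypothetical, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8          # toy head dimension
n_chunks = 6   # long context split into chunks
span = 4       # L2 entries proxied by each L1 entry

# L2 cache: fine-grained key activations for every position (toy values)
l2_keys = rng.normal(size=(n_chunks, span, d))
# L1 cache: one compact key per chunk; mean-pooling is a stand-in
# for the learned compaction described in the abstract
l1_keys = l2_keys.mean(axis=1)

def refill(query, top_k=2):
    """Score L1 entries against the query, then refill the
    top-k entries with their detailed L2 spans."""
    scores = l1_keys @ query                    # query attends to L1 only
    top = set(np.argsort(scores)[::-1][:top_k]) # most relevant chunks
    # working cache = compact L1 view, with top chunks expanded to L2 detail
    parts = [l2_keys[i] if i in top else l1_keys[i:i + 1]
             for i in range(n_chunks)]
    return np.concatenate(parts, axis=0)

query = rng.normal(size=d)
cache = refill(query, top_k=2)
# 2 chunks expanded to 4 entries each + 4 kept compact = 12 entries
print(cache.shape)  # (12, 8)
```

The working cache stays far smaller than the full L2 cache while retaining query-relevant detail, which is the efficiency/performance trade-off the abstract highlights.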
Anthology ID:
2025.acl-long.465
Volume:
Proceedings of the 63rd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
July
Year:
2025
Address:
Vienna, Austria
Editors:
Wanxiang Che, Joyce Nabende, Ekaterina Shutova, Mohammad Taher Pilehvar
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
9453–9464
URL:
https://preview.aclanthology.org/ingestion-acl-25/2025.acl-long.465/
Cite (ACL):
Hongjin Qian, Zheng Liu, Peitian Zhang, Zhicheng Dou, and Defu Lian. 2025. Boosting Long-Context Information Seeking via Query-Guided Activation Refilling. In Proceedings of the 63rd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 9453–9464, Vienna, Austria. Association for Computational Linguistics.
Cite (Informal):
Boosting Long-Context Information Seeking via Query-Guided Activation Refilling (Qian et al., ACL 2025)
PDF:
https://preview.aclanthology.org/ingestion-acl-25/2025.acl-long.465.pdf