InfiniteICL: Breaking the Limit of Context Window Size via Long Short-term Memory Transformation

Bowen Cao, Deng Cai, Wai Lam


Abstract
In-context learning (ICL) is critical for large language models (LLMs), but its effectiveness is constrained by finite context windows, particularly in ultra-long contexts. To overcome this, we introduce **InfiniteICL**, a framework that parallels context and parameters in LLMs with short- and long-term memory in human cognitive systems, focusing on transforming temporary context knowledge into permanent parameter updates. This approach significantly reduces memory usage, maintains robust performance across varying input lengths, and theoretically enables infinite context integration through the principles of context knowledge elicitation, selection, and consolidation. Evaluations demonstrate that our method reduces context length by 90% while achieving 103% of the average performance of full-context prompting across fact recall, grounded reasoning, and skill acquisition tasks. When conducting sequential multi-turn transformations on complex, real-world contexts (with lengths of up to 2M tokens), our approach surpasses full-context prompting while using only 0.4% of the original contexts. These findings highlight InfiniteICL’s potential to enhance the scalability and efficiency of LLMs by breaking the limitations of conventional context window sizes.
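The abstract frames InfiniteICL as eliciting, selecting, and consolidating context knowledge into parameter updates so the prompt no longer has to carry the source context. The toy sketch below illustrates only that general idea of trading prompt context for a small weight update; the model name, the plain gradient-descent "consolidation" step, and the hard-coded distilled fact are assumptions for illustration, not the paper's actual algorithm.

```python
# Hypothetical sketch of consolidating context knowledge into parameters.
# The tiny model, learning rate, and plain LM-loss gradient steps are
# illustrative assumptions, NOT the InfiniteICL method described in the paper.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "sshleifer/tiny-gpt2"  # tiny checkpoint, used only to keep the sketch runnable
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# (1) Elicitation/selection: in practice the framework would distill a long
# context into compact knowledge; here a single fact stands in for that output.
distilled_knowledge = "The capital of the fictional country Valdoria is Brightport."

# (2) Consolidation: write the knowledge into the weights with a few
# language-modeling gradient steps instead of keeping it in the prompt.
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3)
batch = tokenizer(distilled_knowledge, return_tensors="pt")
model.train()
for _ in range(5):
    loss = model(**batch, labels=batch["input_ids"]).loss
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()

# (3) Inference without the original context: the query alone goes into the
# prompt, so the context window stays free for new input.
model.eval()
query = tokenizer("The capital of Valdoria is", return_tensors="pt")
print(tokenizer.decode(model.generate(**query, max_new_tokens=8)[0]))
```

With a real model and the paper's actual elicitation and consolidation procedures, the same pattern would let knowledge from a 2M-token context be injected as parameter updates while the prompt itself stays short.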
Anthology ID: 2025.findings-acl.595
Volume: Findings of the Association for Computational Linguistics: ACL 2025
Month: July
Year: 2025
Address: Vienna, Austria
Editors: Wanxiang Che, Joyce Nabende, Ekaterina Shutova, Mohammad Taher Pilehvar
Venue: Findings
Publisher: Association for Computational Linguistics
Pages: 11402–11415
URL: https://preview.aclanthology.org/landing_page/2025.findings-acl.595/
Cite (ACL): Bowen Cao, Deng Cai, and Wai Lam. 2025. InfiniteICL: Breaking the Limit of Context Window Size via Long Short-term Memory Transformation. In Findings of the Association for Computational Linguistics: ACL 2025, pages 11402–11415, Vienna, Austria. Association for Computational Linguistics.
Cite (Informal): InfiniteICL: Breaking the Limit of Context Window Size via Long Short-term Memory Transformation (Cao et al., Findings 2025)
PDF: https://preview.aclanthology.org/landing_page/2025.findings-acl.595.pdf