R3Mem: Bridging Memory Retention and Retrieval via Reversible Compression

Xiaoqiang Wang, Suyuchen Wang, Yun Zhu, Bang Liu


Abstract
Memory plays a key role in enhancing LLMs’ performance when they are deployed in real-world applications. Existing solutions face a trade-off: explicit memory designs based on external storage require complex management and incur storage overhead, while implicit memory designs that store information in model parameters struggle with reliable retrieval. In this paper, we propose R3Mem, a memory network that optimizes both information Retention and Retrieval through Reversible context compression. Specifically, R3Mem employs virtual memory tokens to compress and encode arbitrarily long histories, further enhanced by a hierarchical compression strategy that refines information from the document level down to the entity level for improved assimilation across granularities. For retrieval, R3Mem uses a reversible architecture, reconstructing the raw data by invoking the model backward with the compressed information. Implemented via parameter-efficient fine-tuning, it integrates seamlessly with any Transformer-based model. Experiments demonstrate that our memory design achieves state-of-the-art performance on long-context language modeling and retrieval-augmented generation tasks. It also significantly outperforms conventional memory modules in long-horizon interaction tasks such as conversational agents, showcasing its potential for next-generation retrieval systems.
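
To make the retention–retrieval loop concrete, below is a minimal, self-contained PyTorch sketch of the idea the abstract describes; it is not the authors' implementation. Learnable virtual memory tokens are appended to the input, the encoder states at those positions serve as the fixed-size compressed memory, and a decoder with a tied output head stands in for the paper's reversible backward pass to reconstruct the original tokens. All names (ToyReversibleMemory, compress, reconstruct) and hyperparameters are illustrative assumptions.

import torch
import torch.nn as nn

class ToyReversibleMemory(nn.Module):
    # Hypothetical toy model, NOT the paper's architecture. It illustrates
    # (1) retention: compressing a sequence into a few memory-token states, and
    # (2) retrieval: decoding the original tokens back from those states.
    def __init__(self, vocab=1000, d=64, n_heads=4, n_mem=8, max_len=512):
        super().__init__()
        self.embed = nn.Embedding(vocab, d)
        self.pos = nn.Embedding(max_len, d)                    # position queries for decoding
        self.mem = nn.Parameter(0.02 * torch.randn(n_mem, d))  # learnable virtual memory tokens
        enc = nn.TransformerEncoderLayer(d, n_heads, batch_first=True)
        dec = nn.TransformerDecoderLayer(d, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(enc, num_layers=2)
        self.decoder = nn.TransformerDecoder(dec, num_layers=2)
        self.n_mem = n_mem

    def compress(self, tokens):
        # tokens: (B, T) -> memory: (B, n_mem, d); only the memory-token
        # states are kept, so the context is stored at a fixed size.
        x = self.embed(tokens)
        m = self.mem.expand(tokens.size(0), -1, -1)
        h = self.encoder(torch.cat([x, m], dim=1))
        return h[:, -self.n_mem:, :]

    def reconstruct(self, memory, length):
        # memory: (B, n_mem, d) -> logits: (B, length, vocab). The decoder
        # attends to the compressed memory; tying the output head to the
        # input embedding is a weak stand-in for the paper's reversibility.
        q = self.pos.weight[:length].unsqueeze(0).expand(memory.size(0), -1, -1)
        h = self.decoder(tgt=q, memory=memory)
        return h @ self.embed.weight.T

model = ToyReversibleMemory()
tokens = torch.randint(0, 1000, (2, 128))       # a batch of 2 "histories"
memory = model.compress(tokens)                 # retention: 128 tokens -> 8 states each
logits = model.reconstruct(memory, length=128)  # retrieval: decode the tokens back
loss = nn.functional.cross_entropy(logits.reshape(-1, 1000), tokens.reshape(-1))
print(memory.shape, logits.shape, round(loss.item(), 3))

Training such a reconstruction loss end to end is what couples retention to retrieval; per the abstract, the actual system achieves this with parameter-efficient fine-tuning on a pretrained Transformer and a genuinely reversible architecture rather than a separate decoder.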
Anthology ID: 2025.findings-acl.235
Volume: Findings of the Association for Computational Linguistics: ACL 2025
Month: July
Year: 2025
Address: Vienna, Austria
Editors: Wanxiang Che, Joyce Nabende, Ekaterina Shutova, Mohammad Taher Pilehvar
Venues: Findings | WS
Publisher: Association for Computational Linguistics
Pages: 4541–4557
URL: https://preview.aclanthology.org/ingestion-acl-25/2025.findings-acl.235/
Cite (ACL): Xiaoqiang Wang, Suyuchen Wang, Yun Zhu, and Bang Liu. 2025. R3Mem: Bridging Memory Retention and Retrieval via Reversible Compression. In Findings of the Association for Computational Linguistics: ACL 2025, pages 4541–4557, Vienna, Austria. Association for Computational Linguistics.
Cite (Informal): R3Mem: Bridging Memory Retention and Retrieval via Reversible Compression (Wang et al., Findings 2025)
PDF: https://preview.aclanthology.org/ingestion-acl-25/2025.findings-acl.235.pdf