SynapticRAG: Enhancing Temporal Memory Retrieval in Large Language Models through Synaptic Mechanisms

Yuki Hou, Haruki Tamoto, Qinghua Zhao, Homei Miyashita


Abstract
Existing retrieval methods in Large Language Models show degraded accuracy when handling temporally distributed conversations, primarily due to their reliance on simple similarity-based retrieval. Unlike existing memory retrieval methods that rely solely on semantic similarity, we propose SynapticRAG, which combines temporal association triggers with biologically inspired synaptic propagation: temporal triggers and synaptic-like stimulus propagation identify relevant dialogue histories, and a dynamic leaky integrate-and-fire mechanism then selects the most contextually appropriate memories. Experiments on four datasets in English, Chinese, and Japanese show that SynapticRAG achieves consistent improvements over state-of-the-art memory retrieval methods across multiple metrics, by up to 14.66 percentage points. This work bridges the gap between cognitive science and language model development, providing a new framework for memory management in conversational systems.
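To make the selection step concrete, here is a minimal sketch of a generic leaky integrate-and-fire update of the kind the abstract alludes to. This is not the paper's exact formulation; the parameter names (`tau`, `threshold`), the reset-to-zero behavior, and the stimulus values are illustrative assumptions only.

```python
def lif_select(stimuli, tau=2.0, threshold=1.0):
    """Accumulate a leaky membrane potential over a sequence of stimulus
    strengths; return the indices of steps at which the potential crosses
    the firing threshold (i.e., the memories that would be "selected").

    NOTE: a generic LIF sketch, not SynapticRAG's actual equations.
    """
    potential = 0.0
    fired = []
    for t, s in enumerate(stimuli):
        # Leak a fraction of the accumulated potential, then integrate
        # the new stimulus.
        potential = potential * (1.0 - 1.0 / tau) + s
        if potential >= threshold:
            fired.append(t)
            potential = 0.0  # reset after firing (one common convention)
    return fired
```

With `tau=2.0`, half the potential leaks away each step, so only stimuli that arrive close together (or a single strong stimulus) push the potential over the threshold, which is the intuition behind using LIF dynamics to favor temporally clustered, strongly associated memories.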
Anthology ID:
2025.findings-acl.1048
Volume:
Findings of the Association for Computational Linguistics: ACL 2025
Month:
July
Year:
2025
Address:
Vienna, Austria
Editors:
Wanxiang Che, Joyce Nabende, Ekaterina Shutova, Mohammad Taher Pilehvar
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
20422–20436
URL:
https://preview.aclanthology.org/landing_page/2025.findings-acl.1048/
Cite (ACL):
Yuki Hou, Haruki Tamoto, Qinghua Zhao, and Homei Miyashita. 2025. SynapticRAG: Enhancing Temporal Memory Retrieval in Large Language Models through Synaptic Mechanisms. In Findings of the Association for Computational Linguistics: ACL 2025, pages 20422–20436, Vienna, Austria. Association for Computational Linguistics.
Cite (Informal):
SynapticRAG: Enhancing Temporal Memory Retrieval in Large Language Models through Synaptic Mechanisms (Hou et al., Findings 2025)
PDF:
https://preview.aclanthology.org/landing_page/2025.findings-acl.1048.pdf