MindBridge: Scalable and Cross-Model Knowledge Editing via Memory-Augmented Modality

Shuaike Li, Kai Zhang, Qi Liu, Enhong Chen

Abstract
Knowledge editing is a technique for efficiently and accurately updating the knowledge of large language models (LLMs) to alleviate obsolescence and correct errors. However, most existing methods overfit to specific models, causing edited knowledge to be discarded during each LLM update and requiring frequent re-editing, which is particularly burdensome in today’s rapidly evolving open-source community. To address this issue, we propose the problem of cross-model knowledge editing and introduce **MindBridge**, a scalable solution inspired by the low coupling between modality processing and LLMs in multi-modal models. MindBridge introduces the novel concept of **memory modality**, which encodes edited knowledge as an independent modality. It first performs LLM-agnostic pre-training of the memory modality and then integrates it with various LLMs. Extensive experiments on multiple LLMs and popular knowledge editing datasets demonstrate that MindBridge achieves superior performance even in editing tens of thousands of knowledge entries and can flexibly adapt to different LLMs. Our code is available at https://github.com/CrashBugger/MindBridge.
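For intuition only, here is a minimal PyTorch sketch of the idea described in the abstract: a memory encoder trained independently of any LLM, plus a small per-LLM projector that maps the resulting memory tokens into a target model's embedding space, much like a vision projector couples an image encoder to a multimodal LLM. All module and parameter names (MemoryEncoder, MemoryProjector, mem_dim, num_tokens, the 4096 hidden size) are assumptions for illustration, not taken from the paper or its released code.

```python
# Illustrative sketch only: hypothetical module names, not the paper's implementation.
import torch
import torch.nn as nn

class MemoryEncoder(nn.Module):
    """Encodes edited knowledge into 'memory modality' embeddings,
    pre-trained independently of any particular LLM (hypothetical)."""
    def __init__(self, vocab_size: int, mem_dim: int = 512, num_tokens: int = 4):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, mem_dim)
        self.encoder = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(d_model=mem_dim, nhead=8, batch_first=True),
            num_layers=2,
        )
        self.num_tokens = num_tokens

    def forward(self, knowledge_ids: torch.Tensor) -> torch.Tensor:
        # knowledge_ids: (batch, seq_len) token ids describing an edited fact
        h = self.encoder(self.embed(knowledge_ids))
        return h[:, : self.num_tokens, :]        # (batch, num_tokens, mem_dim)

class MemoryProjector(nn.Module):
    """Lightweight bridge that maps memory embeddings into a target LLM's
    input-embedding space; only this part is LLM-specific (hypothetical)."""
    def __init__(self, mem_dim: int, llm_hidden: int):
        super().__init__()
        self.proj = nn.Linear(mem_dim, llm_hidden)

    def forward(self, mem: torch.Tensor) -> torch.Tensor:
        return self.proj(mem)                    # (batch, num_tokens, llm_hidden)

# Usage sketch: prepend projected memory tokens to the LLM's prompt embeddings,
# analogous to how image features are injected into a multimodal LLM.
encoder = MemoryEncoder(vocab_size=32000)
projector = MemoryProjector(mem_dim=512, llm_hidden=4096)   # 4096 assumed for a 7B LLM
knowledge_ids = torch.randint(0, 32000, (1, 16))
prompt_embeds = torch.randn(1, 10, 4096)                    # stand-in for LLM token embeddings
mem_tokens = projector(encoder(knowledge_ids))
llm_inputs = torch.cat([mem_tokens, prompt_embeds], dim=1)  # would be fed to the frozen LLM
```

Because only the projector depends on the target LLM's hidden size, the same pre-trained memory encoder could, under this reading, be reused across different or updated LLMs, which is the cross-model property the abstract emphasizes.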
Anthology ID:
2025.findings-acl.621
Volume:
Findings of the Association for Computational Linguistics: ACL 2025
Month:
July
Year:
2025
Address:
Vienna, Austria
Editors:
Wanxiang Che, Joyce Nabende, Ekaterina Shutova, Mohammad Taher Pilehvar
Venues:
Findings | WS
Publisher:
Association for Computational Linguistics
Pages:
11999–12013
URL:
https://preview.aclanthology.org/ingestion-acl-25/2025.findings-acl.621/
Cite (ACL):
Shuaike Li, Kai Zhang, Qi Liu, and Enhong Chen. 2025. MindBridge: Scalable and Cross-Model Knowledge Editing via Memory-Augmented Modality. In Findings of the Association for Computational Linguistics: ACL 2025, pages 11999–12013, Vienna, Austria. Association for Computational Linguistics.
Cite (Informal):
MindBridge: Scalable and Cross-Model Knowledge Editing via Memory-Augmented Modality (Li et al., Findings 2025)
PDF:
https://preview.aclanthology.org/ingestion-acl-25/2025.findings-acl.621.pdf