Decoupling Reasoning and Knowledge Injection for In-Context Knowledge Editing
Changyue Wang, Weihang Su, Qingyao Ai, Yujia Zhou, Yiqun Liu
Abstract
Knowledge editing enables efficient updates to Large Language Models (LLMs) by modifying specific knowledge without full-model retraining. Among knowledge editing approaches, in-context editing (ICE) stands out for its ability to inject knowledge without modifying the model’s parameters. However, existing ICE approaches directly edit the model context without isolating target knowledge from the reasoning path of model inference, resulting in unreliable and low-quality outputs, particularly on multi-hop tasks. To investigate this issue, we analyze the interaction between reasoning-path planning and knowledge injection, showing that the reasoning ability of an LLM is usually coupled with its original knowledge, and that directly replacing old knowledge with new knowledge can simultaneously hurt the LLM’s task-reasoning performance. Based on these findings, we propose DecKER, a novel ICE framework that separates model reasoning from knowledge editing. Extensive experiments show that DecKER significantly improves multi-hop reasoning performance by mitigating knowledge conflicts and preserving reasoning integrity.
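The core idea of decoupling is to first plan the reasoning path with the model’s knowledge untouched, and only then inject edited facts when resolving individual hops. The Python sketch below illustrates this split under our own assumptions; it is not the paper’s DecKER algorithm, and all identifiers (EDIT_MEMORY, plan_reasoning_path, answer_hop, llm_generate) are hypothetical placeholders.

```python
# Toy sketch of decoupled in-context editing (ICE). This is NOT the
# authors' DecKER implementation; every name below is a hypothetical
# placeholder used only to illustrate the separation of concerns.

# A toy edit store: (subject, relation) -> updated object.
EDIT_MEMORY = {
    ("United Kingdom", "head of government"): "Keir Starmer",
}

def llm_generate(prompt: str) -> str:
    """Stand-in for a call to any LLM; wire up a real client here."""
    raise NotImplementedError

def plan_reasoning_path(question: str) -> list[str]:
    """Stage 1: plan the reasoning path WITHOUT injecting edited facts,
    so planning can rely on the model's intact reasoning ability."""
    prompt = f"Decompose into sub-questions, one per line:\n{question}"
    return [s.strip() for s in llm_generate(prompt).splitlines() if s.strip()]

def answer_hop(sub_question: str, context: str) -> str:
    """Stage 2: resolve one hop, letting the edit memory override the
    model's (possibly stale) parametric knowledge when it matches."""
    for (subject, relation), new_object in EDIT_MEMORY.items():
        if subject in sub_question and relation in sub_question:
            return new_object  # injected (edited) knowledge wins
    return llm_generate(f"{context}\nAnswer briefly: {sub_question}")

def edit_aware_answer(question: str) -> str:
    """Plan first, then fill in each hop; knowledge injection never
    touches the planning step."""
    context = ""
    for hop in plan_reasoning_path(question):
        context += f"{hop} -> {answer_hop(hop, context)}\n"
    return llm_generate(f"{context}\nFinal answer to: {question}")
```

Keeping the planning stage free of injected edits is what avoids the knowledge conflicts the abstract describes: the model’s coupled reasoning ability is exercised on its original knowledge, while updated facts enter only at the per-hop answering stage.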
- Anthology ID: 2025.findings-acl.1260
- Volume: Findings of the Association for Computational Linguistics: ACL 2025
- Month: July
- Year: 2025
- Address: Vienna, Austria
- Editors: Wanxiang Che, Joyce Nabende, Ekaterina Shutova, Mohammad Taher Pilehvar
- Venue: Findings
- Publisher: Association for Computational Linguistics
- Pages: 24543–24562
- URL: https://preview.aclanthology.org/mtsummit-25-ingestion/2025.findings-acl.1260/
- DOI: 10.18653/v1/2025.findings-acl.1260
- Cite (ACL): Changyue Wang, Weihang Su, Qingyao Ai, Yujia Zhou, and Yiqun Liu. 2025. Decoupling Reasoning and Knowledge Injection for In-Context Knowledge Editing. In Findings of the Association for Computational Linguistics: ACL 2025, pages 24543–24562, Vienna, Austria. Association for Computational Linguistics.
- Cite (Informal): Decoupling Reasoning and Knowledge Injection for In-Context Knowledge Editing (Wang et al., Findings 2025)
- PDF: https://preview.aclanthology.org/mtsummit-25-ingestion/2025.findings-acl.1260.pdf