Avoiding Knowledge Edit Skipping in Multi-hop Question Answering with Guided Decomposition

Yi Liu, Xiangrong Zhu, Xiangyu Liu, Wei Wei, Wei Hu

Abstract
In a rapidly evolving world where information updates swiftly, knowledge in large language models (LLMs) becomes outdated quickly. Retraining LLMs is not a cost-effective option, making knowledge editing (KE) without modifying parameters particularly necessary. We find that although existing retrieval-augmented generation (RAG)-based KE methods excel at editing simple knowledge, they struggle with KE in multi-hop question answering due to the issue of “edit skipping”, which refers to skipping over a relevant edited fact during inference. Beyond the diversity of natural language expressions of knowledge, edit skipping also arises from the mismatch between the granularity at which LLMs decompose problems and the granularity of the facts in the edited memory. To address this issue, we propose a novel Iterative Retrieval-Augmented Knowledge Editing method with guided decomposition (IRAKE), which draws guidance from both single edited facts and entire edited cases. Experimental results demonstrate that IRAKE mitigates editing failures caused by edit skipping and outperforms state-of-the-art methods for KE in multi-hop question answering.
Anthology ID:
2025.findings-emnlp.883
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2025
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
16256–16272
URL:
https://preview.aclanthology.org/author-page-yu-wang-polytechnic/2025.findings-emnlp.883/
DOI:
10.18653/v1/2025.findings-emnlp.883
Cite (ACL):
Yi Liu, Xiangrong Zhu, Xiangyu Liu, Wei Wei, and Wei Hu. 2025. Avoiding Knowledge Edit Skipping in Multi-hop Question Answering with Guided Decomposition. In Findings of the Association for Computational Linguistics: EMNLP 2025, pages 16256–16272, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
Avoiding Knowledge Edit Skipping in Multi-hop Question Answering with Guided Decomposition (Liu et al., Findings 2025)
PDF:
https://preview.aclanthology.org/author-page-yu-wang-polytechnic/2025.findings-emnlp.883.pdf
Checklist:
2025.findings-emnlp.883.checklist.pdf