RelEdit: Evaluating Conceptual Knowledge Editing in Language Models via Relational Reasoning

Yifan Niu, Miao Peng, Nuo Chen, Yatao Bian, Tingyang Xu, Jia Li


Abstract
The conceptual knowledge in Large Language Models (LLMs) can become outdated over time, and concept editing offers a way to keep it current. Current evaluations of conceptual knowledge editing primarily focus on whether the definitions of concepts are successfully edited, neglecting the impact on the model's related beliefs. To address this gap, we introduce a benchmark called RelEdit, which includes criteria and questions to assess both concept-level and instance-level relational reasoning abilities of edited models. Our findings reveal that existing knowledge editing methods struggle to reason about related conceptual knowledge effectively. We further introduce a simple memory-based in-context editing baseline, MICE, which prompts the language model to generate answers that align with the edited concepts stored in external memory. We find that MICE obtains the best scores on our benchmark, suggesting a promising research direction for model editing.
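The abstract describes MICE as prompting the model with edited concepts kept in an external memory. The snippet below is a minimal sketch of that general retrieve-and-prepend idea, assuming a toy string-similarity retriever and hypothetical memory contents; it is an illustration of the concept, not the authors' implementation.

```python
# Hypothetical sketch of memory-based in-context editing (not the paper's code).
# Edited concept definitions live in an external memory; at query time the most
# relevant entries are retrieved and prepended to the prompt so the model's
# answer stays consistent with the edits.
from difflib import SequenceMatcher

# External memory: concept name -> edited definition (illustrative entry).
edit_memory = {
    "planet": "A planet is a celestial body that ... (edited definition).",
}

def retrieve(query: str, memory: dict, top_k: int = 1) -> list[str]:
    """Return the edited definitions whose concept names best match the query."""
    scored = sorted(
        memory.items(),
        key=lambda kv: SequenceMatcher(None, kv[0].lower(), query.lower()).ratio(),
        reverse=True,
    )
    return [definition for _, definition in scored[:top_k]]

def build_prompt(query: str) -> str:
    """Prepend retrieved edits so the model answers in line with them."""
    context = "\n".join(retrieve(query, edit_memory))
    return (
        "Answer the question consistently with the updated facts below.\n"
        f"Updated facts:\n{context}\n\n"
        f"Question: {query}\nAnswer:"
    )

print(build_prompt("Is Pluto a planet?"))
```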
Anthology ID:
2025.findings-acl.533
Volume:
Findings of the Association for Computational Linguistics: ACL 2025
Month:
July
Year:
2025
Address:
Vienna, Austria
Editors:
Wanxiang Che, Joyce Nabende, Ekaterina Shutova, Mohammad Taher Pilehvar
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
10220–10238
URL:
https://preview.aclanthology.org/display_plenaries/2025.findings-acl.533/
Cite (ACL):
Yifan Niu, Miao Peng, Nuo Chen, Yatao Bian, Tingyang Xu, and Jia Li. 2025. RelEdit: Evaluating Conceptual Knowledge Editing in Language Models via Relational Reasoning. In Findings of the Association for Computational Linguistics: ACL 2025, pages 10220–10238, Vienna, Austria. Association for Computational Linguistics.
Cite (Informal):
RelEdit: Evaluating Conceptual Knowledge Editing in Language Models via Relational Reasoning (Niu et al., Findings 2025)
PDF:
https://preview.aclanthology.org/display_plenaries/2025.findings-acl.533.pdf