Persuasion Tokens for Editing Factual Knowledge in LLMs

Paul Youssef, Jörg Schlötterer, Christin Seifert


Abstract
In-context knowledge editing (IKE) is a promising technique for updating Large Language Models (LLMs) with new information. However, IKE relies on lengthy, fact-specific demonstrations, which are costly to create and consume significant context window space. In this paper, we introduce persuasion tokens (P-Tokens) – special tokens trained to replicate the effect of IKE demonstrations, enabling efficient knowledge editing without fact-specific demonstrations. We evaluate P-Tokens across two editing datasets and three LLMs, demonstrating performance comparable to, and often exceeding, IKE. We further find that editing performance is robust to distractors, with only small negative effects on neighboring facts, and that increasing the number of P-Tokens improves performance. Our work addresses key limitations of IKE and provides a more practical and scalable alternative for editing LLMs.
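The abstract's core idea — trained special tokens standing in for in-context demonstrations — can be sketched as a soft-prompt-style setup. The sketch below is a minimal illustration, assuming P-Tokens are learned embedding vectors prepended to the edit prompt at inference time; the names, dimensions, and omitted training objective are illustrative, not the paper's actual implementation.

```python
# Toy sketch: "persuasion tokens" as trainable soft-prompt embeddings
# prepended to the edit prompt, replacing lengthy IKE demonstrations.
# All names and sizes are hypothetical; the training loop is omitted.
import numpy as np

rng = np.random.default_rng(0)
d_model = 8       # hidden size of the (toy) LLM
n_p_tokens = 4    # number of persuasion tokens to learn
seq_len = 5       # length of the edit prompt, e.g. "New fact: ... Q: ..."

# Frozen input embeddings for the edit prompt (would come from the LLM's
# embedding layer in a real setup).
prompt_embeds = rng.normal(size=(seq_len, d_model))

# Trainable P-token embeddings, optimized offline to mimic the effect of
# fact-specific IKE demonstrations.
p_tokens = rng.normal(size=(n_p_tokens, d_model))

# At edit time, the P-Tokens are simply prepended to the prompt embeddings,
# so the context cost is n_p_tokens positions instead of full demonstrations.
inputs = np.concatenate([p_tokens, prompt_embeds], axis=0)
print(inputs.shape)  # (9, 8): n_p_tokens + seq_len positions
```

The practical point of this design is the one the abstract makes: a small, fixed number of learned positions replaces demonstrations that would otherwise grow with each edit.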
Anthology ID:
2026.eacl-short.35
Volume:
Proceedings of the 19th Conference of the European Chapter of the Association for Computational Linguistics (Volume 2: Short Papers)
Month:
March
Year:
2026
Address:
Rabat, Morocco
Editors:
Vera Demberg, Kentaro Inui, Lluís Marquez
Venue:
EACL
Publisher:
Association for Computational Linguistics
Pages:
475–486
URL:
https://preview.aclanthology.org/ingest-eacl/2026.eacl-short.35/
Cite (ACL):
Paul Youssef, Jörg Schlötterer, and Christin Seifert. 2026. Persuasion Tokens for Editing Factual Knowledge in LLMs. In Proceedings of the 19th Conference of the European Chapter of the Association for Computational Linguistics (Volume 2: Short Papers), pages 475–486, Rabat, Morocco. Association for Computational Linguistics.
Cite (Informal):
Persuasion Tokens for Editing Factual Knowledge in LLMs (Youssef et al., EACL 2026)
PDF:
https://preview.aclanthology.org/ingest-eacl/2026.eacl-short.35.pdf
Checklist:
 2026.eacl-short.35.checklist.pdf