Conflict-Aware Soft Prompting for Retrieval-Augmented Generation

Eunseong Choi, June Park, Hyeri Lee, Jongwuk Lee


Abstract
Retrieval-augmented generation (RAG) enhances the capabilities of large language models (LLMs) by incorporating external knowledge into their input prompts. However, when the retrieved context contradicts the LLM's parametric knowledge, the model often fails to resolve the conflict between incorrect external context and correct parametric knowledge, a problem known as context-memory conflict. To tackle this problem, we introduce Conflict-Aware REtrieval-Augmented Generation (CARE), which consists of a context assessor and a base LLM. The context assessor encodes the external context into compact memory embeddings. Through grounded/adversarial soft prompting, the context assessor is trained to discern unreliable context and to capture a guidance signal that directs reasoning toward the more reliable knowledge source. Extensive experiments show that CARE effectively mitigates context-memory conflict, yielding an average performance gain of 5.0% on QA and fact-checking benchmarks and establishing a promising direction for trustworthy and adaptive RAG systems.
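The abstract describes the context assessor as compressing retrieved context into a small set of soft-prompt ("memory") embeddings that steer the base LLM. The sketch below illustrates one plausible realization of that interface in PyTorch; it is not the authors' implementation, and all names and hyperparameters (ContextAssessor, bert-base-uncased as the encoder, num_memory_tokens=8, a 4096-dim LLM embedding space) are illustrative assumptions.

```python
# Minimal sketch of a CARE-style context assessor (illustrative, not the
# paper's code): a small encoder reads the retrieved passage, k learnable
# queries pool it into k memory slots, and a projection maps the slots
# into the base LLM's embedding space to serve as soft prompts.
import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer


class ContextAssessor(nn.Module):
    def __init__(self, encoder_name="bert-base-uncased",
                 num_memory_tokens=8, llm_hidden_size=4096):
        super().__init__()
        self.tokenizer = AutoTokenizer.from_pretrained(encoder_name)
        self.encoder = AutoModel.from_pretrained(encoder_name)
        enc_dim = self.encoder.config.hidden_size  # 768 for bert-base
        # k learnable query vectors that attend over the encoded context.
        self.memory_queries = nn.Parameter(torch.randn(num_memory_tokens, enc_dim))
        self.pool = nn.MultiheadAttention(enc_dim, num_heads=8, batch_first=True)
        # Project the pooled memory slots into the LLM's embedding space.
        self.proj = nn.Linear(enc_dim, llm_hidden_size)

    def forward(self, context_texts):
        batch = self.tokenizer(context_texts, padding=True, truncation=True,
                               return_tensors="pt")
        hidden = self.encoder(**batch).last_hidden_state      # (B, T, enc_dim)
        queries = self.memory_queries.unsqueeze(0).expand(hidden.size(0), -1, -1)
        # Cross-attention pooling: each query slot summarizes the context,
        # ignoring padding positions.
        memory, _ = self.pool(queries, hidden, hidden,
                              key_padding_mask=batch["attention_mask"] == 0)
        return self.proj(memory)  # (B, k, llm_hidden_size) soft-prompt embeddings
```

In such a setup, the returned embeddings would be prepended to the base LLM's token embeddings (e.g., via the `inputs_embeds` argument of a Hugging Face causal LM) rather than inserted as discrete tokens. The grounded/adversarial training the abstract mentions could then alternate contexts that support the gold answer with contexts that contradict the LLM's parametric knowledge, so the assessor learns to encode a reliability signal; the specific training objective here is an assumption, as the abstract does not spell it out.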
Anthology ID:
2025.emnlp-main.1371
Volume:
Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
26969–26983
URL:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.1371/
Cite (ACL):
Eunseong Choi, June Park, Hyeri Lee, and Jongwuk Lee. 2025. Conflict-Aware Soft Prompting for Retrieval-Augmented Generation. In Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing, pages 26969–26983, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
Conflict-Aware Soft Prompting for Retrieval-Augmented Generation (Choi et al., EMNLP 2025)
PDF:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.1371.pdf
Checklist:
2025.emnlp-main.1371.checklist.pdf