Data-Efficient Automatic Prompt Optimization for Memory-Enhanced Conversational Agents

Ervine Zheng, Yikuan Li, Geoffrey Jay Tso, Jilong Kuang


Abstract
Automatic prompt optimization (APO) uses algorithms to automatically refine prompts for LLMs, effectively reducing human effort in prompt engineering. However, applying APO to memory-enhanced conversational agents presents unique challenges. These agents leverage memory to retain information from historical interactions with users and provide context-aware, personalized responses. Optimizing prompts for these agents is challenging due to their complex, interconnected modules, which include memory writing, memory reading, and response generation. This paper introduces a data-efficient framework for APO in such agents. Our approach leverages LLMs to holistically optimize the prompts of all modules. We also introduce an automated evaluation module that not only provides a holistic quality score for responses but also performs error attribution, pinpointing failures within specific modules. More importantly, to ensure the evaluation module aligns with human judgment, we develop a data-efficient active sampling algorithm with convex optimization to select the most informative samples for human feedback and prompt improvement. We conducted experiments on two health-related conversation datasets to demonstrate the effectiveness of the proposed framework.
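To make the module structure described above concrete, the following is a minimal illustrative sketch (not the paper's implementation) of a memory-enhanced agent whose memory-writing, memory-reading, and response-generation steps each carry their own prompt; an APO loop would treat these prompts as the variables to refine jointly. All names, including the stub call_llm, are hypothetical assumptions for illustration.

```python
# Hypothetical sketch of a memory-enhanced conversational agent with three
# prompted modules, assuming a generic LLM call. Not the authors' code.

from dataclasses import dataclass, field
from typing import List


def call_llm(prompt: str, user_input: str) -> str:
    """Placeholder for an LLM call; a real system would query a hosted model."""
    return f"[LLM output | prompt={prompt[:24]!r} | input={user_input[:24]!r}]"


@dataclass
class MemoryAgent:
    # Each module has its own prompt; an APO framework would optimize these
    # jointly rather than tuning one module in isolation.
    write_prompt: str = "Summarize facts about the user worth remembering."
    read_prompt: str = "Select stored memories relevant to the current turn."
    respond_prompt: str = "Answer using the retrieved memories and the message."
    memory: List[str] = field(default_factory=list)

    def step(self, user_message: str) -> str:
        # Memory writing: distill the new turn into a stored memory entry.
        self.memory.append(call_llm(self.write_prompt, user_message))
        # Memory reading: retrieve context relevant to the current turn.
        retrieved = call_llm(self.read_prompt, "\n".join(self.memory))
        # Response generation: condition on the message plus retrieved memories.
        return call_llm(self.respond_prompt, f"{user_message}\n{retrieved}")


if __name__ == "__main__":
    agent = MemoryAgent()
    print(agent.step("I've been sleeping poorly since starting night shifts."))
```

Because an error in the final response can originate in any of the three modules, an automated evaluator with error attribution (as the abstract describes) would score the output of step and flag which prompt most likely caused a failure.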
Anthology ID:
2025.emnlp-industry.126
Volume:
Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing: Industry Track
Month:
November
Year:
2025
Address:
Suzhou (China)
Editors:
Saloni Potdar, Lina Rojas-Barahona, Sebastien Montella
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
1793–1804
URL:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-industry.126/
Cite (ACL):
Ervine Zheng, Yikuan Li, Geoffrey Jay Tso, and Jilong Kuang. 2025. Data-Efficient Automatic Prompt Optimization for Memory-Enhanced Conversational Agents. In Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing: Industry Track, pages 1793–1804, Suzhou (China). Association for Computational Linguistics.
Cite (Informal):
Data-Efficient Automatic Prompt Optimization for Memory-Enhanced Conversational Agents (Zheng et al., EMNLP 2025)
PDF:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-industry.126.pdf