Direct Prompt Optimization with Continuous Representations

Yangkun Wang, Zihan Wang, Jingbo Shang


Abstract
Prompt optimization for language models faces challenges due to the large discrete search space, the reliance on continuous gradient updates, and the need to round continuous representations into discrete prompts, which causes inflexibility and instability. Existing methods attempt to address these issues by constraining the search space and adopting greedy, incremental improvements, but they often fail to fully leverage historical gradient information. In this paper, we model the prompt optimization problem via the probability distribution over prompts and present a novel approach that integrates greedy strategies into optimization with continuous representations. This approach can exploit historical gradient information to address the instability caused by rounding in existing methods. Our study indicates that using continuous representations can improve prompt optimization performance on both text classification and attack tasks, across models including GPT-2, OPT, Vicuna, and LLaMA-2, and adapts to models of different sizes.
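
As a rough illustration of the general idea behind continuous-representation prompt optimization (a minimal sketch, not the paper's specific algorithm), the following Python snippet keeps a learnable distribution over vocabulary tokens for each prompt position, optimizes it by gradient descent against a frozen GPT-2, and rounds it to a discrete prompt only at the end. The prompt length, learning rate, target token, and loss here are illustrative assumptions.

import torch
import torch.nn.functional as F
from transformers import GPT2LMHeadModel, GPT2Tokenizer

# Frozen language model; only the continuous prompt representation is trained.
model = GPT2LMHeadModel.from_pretrained("gpt2").eval()
for p in model.parameters():
    p.requires_grad_(False)
tok = GPT2Tokenizer.from_pretrained("gpt2")
emb = model.get_input_embeddings().weight          # (vocab, dim) embedding table

prompt_len = 5
vocab_size = emb.shape[0]
# Continuous representation: one learnable logit vector per prompt position,
# softmax-ed into a probability distribution over the vocabulary.
prompt_logits = torch.zeros(prompt_len, vocab_size, requires_grad=True)
target = tok(" positive", return_tensors="pt").input_ids[0, :1]  # illustrative target token

opt = torch.optim.Adam([prompt_logits], lr=0.3)
for step in range(100):
    probs = F.softmax(prompt_logits, dim=-1)       # (prompt_len, vocab)
    soft_embeds = probs @ emb                      # expected embedding per position
    out = model(inputs_embeds=soft_embeds.unsqueeze(0))
    # Encourage the model to emit the target token right after the prompt.
    loss = F.cross_entropy(out.logits[0, -1:], target)
    opt.zero_grad()
    loss.backward()
    opt.step()

# Round the continuous representation to a discrete prompt.
hard_prompt_ids = prompt_logits.argmax(dim=-1)
print(tok.decode(hard_prompt_ids))

In this sketch, rounding happens only once at the end, so gradient information accumulates in the continuous representation across updates; the paper's actual objectives and greedy integration differ in detail.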
Anthology ID:
2025.acl-long.133
Volume:
Proceedings of the 63rd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
July
Year:
2025
Address:
Vienna, Austria
Editors:
Wanxiang Che, Joyce Nabende, Ekaterina Shutova, Mohammad Taher Pilehvar
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
2642–2652
URL:
https://preview.aclanthology.org/ingestion-acl-25/2025.acl-long.133/
Cite (ACL):
Yangkun Wang, Zihan Wang, and Jingbo Shang. 2025. Direct Prompt Optimization with Continuous Representations. In Proceedings of the 63rd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 2642–2652, Vienna, Austria. Association for Computational Linguistics.
Cite (Informal):
Direct Prompt Optimization with Continuous Representations (Wang et al., ACL 2025)
PDF:
https://preview.aclanthology.org/ingestion-acl-25/2025.acl-long.133.pdf