Position Really Matters: Towards a Holistic Approach for Prompt Tuning

Xianjun Yang, Wei Cheng, Xujiang Zhao, Wenchao Yu, Linda Ruth Petzold, Haifeng Chen


Abstract
Prompt tuning is highly effective in efficiently extracting knowledge from foundation models, encompassing language, vision, and vision-language models. However, it remains unclear whether fixed soft prompts, concatenated with inputs at a predetermined position for all instances irrespective of their inherent disparities, are truly effective. Variables such as the position, length, and representations of prompts across diverse instances and tasks can substantially influence the performance of prompt tuning. We first provide a theoretical analysis revealing that optimizing the position of the prompt so that it encompasses the input can capture additional semantic information that traditional prefix or postfix prompt tuning fails to capture. We then present a holistic parametric prompt tuning strategy that dynamically determines different prompt factors based on the specific task or instance. Experimental results underscore the significant performance improvements achieved by dynamic prompt tuning across a wide range of tasks, including NLP, vision recognition, and vision-language tasks. Furthermore, we establish the universal applicability of our approach under full-data, few-shot, and multitask settings.
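The central idea of the abstract, placing the soft prompt at an instance-dependent position so it can sit inside the input rather than being fixed as a prefix or postfix, can be illustrated with a minimal PyTorch sketch. The module name `DynamicPromptInserter`, the linear position scorer, and the hard-index insertion scheme below are illustrative assumptions for exposition, not the paper's actual parameterization.

```python
import torch
import torch.nn as nn

class DynamicPromptInserter(nn.Module):
    """Hypothetical sketch: splice a learnable soft prompt into the input
    embeddings at a per-instance position, instead of a fixed prefix/postfix.
    The position predictor and splicing scheme are assumptions, not the
    paper's exact method."""

    def __init__(self, embed_dim: int, prompt_len: int = 10):
        super().__init__()
        # Learnable soft prompt tokens (shared across instances in this sketch).
        self.prompt = nn.Parameter(torch.randn(prompt_len, embed_dim) * 0.02)
        # Tiny scorer that predicts an insertion point from the mean input embedding.
        self.pos_scorer = nn.Linear(embed_dim, 1)

    def forward(self, input_embeds: torch.Tensor) -> torch.Tensor:
        # input_embeds: (batch, seq_len, embed_dim)
        batch, seq_len, _ = input_embeds.shape
        # Predict a fractional position in [0, 1] per instance, map it to an index.
        # Note: the hard index is not differentiable; training the scorer end to end
        # would need a soft or straight-through relaxation, omitted here for brevity.
        frac = torch.sigmoid(self.pos_scorer(input_embeds.mean(dim=1)))  # (batch, 1)
        positions = (frac.squeeze(-1) * seq_len).long().clamp(0, seq_len)

        outputs = []
        for b in range(batch):
            p = positions[b].item()
            # Splice the soft prompt into the input at the predicted index.
            outputs.append(torch.cat(
                [input_embeds[b, :p], self.prompt, input_embeds[b, p:]], dim=0))
        return torch.stack(outputs)  # (batch, seq_len + prompt_len, embed_dim)


if __name__ == "__main__":
    x = torch.randn(2, 16, 768)           # dummy input embeddings
    inserter = DynamicPromptInserter(768)
    print(inserter(x).shape)               # torch.Size([2, 26, 768])
```

In this sketch a prefix prompt corresponds to always predicting position 0 and a postfix prompt to position `seq_len`; letting the position vary per instance is what allows the prompt to "encompass" the input.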
Anthology ID:
2025.findings-naacl.474
Volume:
Findings of the Association for Computational Linguistics: NAACL 2025
Month:
April
Year:
2025
Address:
Albuquerque, New Mexico
Editors:
Luis Chiruzzo, Alan Ritter, Lu Wang
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
8501–8523
URL:
https://preview.aclanthology.org/fix-sig-urls/2025.findings-naacl.474/
Cite (ACL):
Xianjun Yang, Wei Cheng, Xujiang Zhao, Wenchao Yu, Linda Ruth Petzold, and Haifeng Chen. 2025. Position Really Matters: Towards a Holistic Approach for Prompt Tuning. In Findings of the Association for Computational Linguistics: NAACL 2025, pages 8501–8523, Albuquerque, New Mexico. Association for Computational Linguistics.
Cite (Informal):
Position Really Matters: Towards a Holistic Approach for Prompt Tuning (Yang et al., Findings 2025)
PDF:
https://preview.aclanthology.org/fix-sig-urls/2025.findings-naacl.474.pdf