Abstract
While prompt tuning approaches have achieved competitive performance with high efficiency, we observe that they invariably employ the same initialization process, wherein the soft prompt is either randomly initialized or derived from an existing embedding vocabulary. In contrast to these conventional methods, this study investigates an alternative way to derive the soft prompt. Our empirical studies show that the soft prompt typically exhibits a low "intrinsic rank" characteristic. Motivated by this observation, we propose decomposed prompt tuning, a novel approach that utilizes low-rank matrices to initialize the soft prompt. Through this low-rank reparameterization, our method significantly reduces the number of trainable parameters while maintaining effectiveness. Experimental results on the SuperGLUE benchmark in both high-resource and low-resource scenarios demonstrate the effectiveness of the proposed method.
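As a rough illustration of the idea described in the abstract (a minimal sketch, not the authors' released code), the low-rank reparameterization replaces a full soft-prompt matrix of shape (prompt length × hidden dimension) with the product of two trainable factors of rank r. Names such as `LowRankPrompt`, `prompt_len`, and `rank` below are illustrative assumptions:

```python
import torch
import torch.nn as nn

class LowRankPrompt(nn.Module):
    """Soft prompt parameterized as a product of two low-rank matrices
    (illustrative sketch of low-rank reparameterization)."""

    def __init__(self, prompt_len: int = 100, hidden_dim: int = 768, rank: int = 4):
        super().__init__()
        # Instead of training a full (prompt_len x hidden_dim) prompt,
        # train A (prompt_len x rank) and B (rank x hidden_dim).
        self.A = nn.Parameter(torch.randn(prompt_len, rank) * 0.02)
        self.B = nn.Parameter(torch.randn(rank, hidden_dim) * 0.02)

    def forward(self, batch_size: int) -> torch.Tensor:
        prompt = self.A @ self.B  # (prompt_len, hidden_dim)
        # Expand across the batch; prepended to input embeddings upstream.
        return prompt.unsqueeze(0).expand(batch_size, -1, -1)

# Trainable-parameter comparison under these assumed sizes:
#   full prompt:        100 * 768           = 76,800 parameters
#   low-rank (rank=4):  100 * 4 + 4 * 768   =  3,472 parameters
```

With a small rank, the factorization trains far fewer parameters than the full prompt, which is the efficiency gain the abstract refers to.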
- Anthology ID: 2023.findings-emnlp.890
- Volume: Findings of the Association for Computational Linguistics: EMNLP 2023
- Month: December
- Year: 2023
- Address: Singapore
- Editors: Houda Bouamor, Juan Pino, Kalika Bali
- Venue: Findings
- Publisher: Association for Computational Linguistics
- Pages: 13335–13347
- URL: https://aclanthology.org/2023.findings-emnlp.890
- DOI: 10.18653/v1/2023.findings-emnlp.890
- Cite (ACL): Yao Xiao, Lu Xu, Jiaxi Li, Wei Lu, and Xiaoli Li. 2023. Decomposed Prompt Tuning via Low-Rank Reparameterization. In Findings of the Association for Computational Linguistics: EMNLP 2023, pages 13335–13347, Singapore. Association for Computational Linguistics.
- Cite (Informal): Decomposed Prompt Tuning via Low-Rank Reparameterization (Xiao et al., Findings 2023)
- PDF: https://preview.aclanthology.org/jeptaln-2024-ingestion/2023.findings-emnlp.890.pdf