ACCEPT: Adaptive Codebook for Composite and Efficient Prompt Tuning

Yu-Chen Lin, Wei-Hua Li, Jun-cheng Chen, Chu-Song Chen


Abstract
Prompt Tuning has become a popular Parameter-Efficient Fine-Tuning method, owing to its remarkable performance with few updated parameters on various large-scale pretrained Language Models (PLMs). Traditionally, each prompt has been considered indivisible and updated independently, causing the number of parameters to grow in proportion to the prompt length. To address this issue, we propose Adaptive Codebook for Composite and Efficient Prompt Tuning (ACCEPT). In our method, we draw on the concept of product quantization (PQ), allowing all soft prompts to share a set of learnable codebook vectors in each subspace, with each prompt differentiated by a set of adaptive weights. We achieve superior performance on 17 diverse natural language tasks, including natural language understanding (NLU) and question answering (QA) tasks, by tuning only 0.3% of the parameters of the PLMs. Our approach also excels in few-shot and large-model settings, highlighting its significant potential.
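To make the abstract's idea concrete, below is a minimal PyTorch-style sketch of a PQ-style composite soft prompt: each prompt token's embedding is split into subspaces, and each subspace segment is a weighted combination of that subspace's shared, learnable codebook vectors. The class name, hyperparameters (prompt length, codebook size, number of subspaces), and the softmax weighting are illustrative assumptions, not the paper's exact parameterization.

```python
import torch
import torch.nn as nn

class CompositePrompt(nn.Module):
    """Sketch of a product-quantization-style composite soft prompt.

    Hypothetical shapes and names; the paper's exact formulation may differ.
    All prompt tokens share per-subspace codebooks and differ only in their
    adaptive mixing weights, so parameters no longer scale linearly with
    prompt length times embedding dimension.
    """

    def __init__(self, prompt_len=100, embed_dim=768, num_subspaces=16, codebook_size=10):
        super().__init__()
        assert embed_dim % num_subspaces == 0
        sub_dim = embed_dim // num_subspaces
        # Shared learnable codebooks: one per subspace, each with `codebook_size` vectors.
        self.codebooks = nn.Parameter(torch.randn(num_subspaces, codebook_size, sub_dim))
        # Adaptive per-token, per-subspace mixing weights.
        self.weights = nn.Parameter(torch.randn(prompt_len, num_subspaces, codebook_size))

    def forward(self):
        # Normalize the weights (softmax is one plausible choice), then mix codebook vectors.
        w = self.weights.softmax(dim=-1)                              # (L, G, K)
        segments = torch.einsum("lgk,gkd->lgd", w, self.codebooks)    # (L, G, d/G)
        # Concatenate the subspace segments back into full prompt embeddings.
        return segments.reshape(segments.size(0), -1)                 # (L, embed_dim)

# The composed prompt would be prepended to the input embeddings of a frozen PLM.
prompt = CompositePrompt()
print(prompt().shape)  # torch.Size([100, 768])
```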
Anthology ID:
2024.findings-emnlp.900
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2024
Month:
November
Year:
2024
Address:
Miami, Florida, USA
Editors:
Yaser Al-Onaizan, Mohit Bansal, Yun-Nung Chen
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
15345–15358
URL:
https://preview.aclanthology.org/build-pipeline-with-new-library/2024.findings-emnlp.900/
DOI:
10.18653/v1/2024.findings-emnlp.900
Cite (ACL):
Yu-Chen Lin, Wei-Hua Li, Jun-cheng Chen, and Chu-Song Chen. 2024. ACCEPT: Adaptive Codebook for Composite and Efficient Prompt Tuning. In Findings of the Association for Computational Linguistics: EMNLP 2024, pages 15345–15358, Miami, Florida, USA. Association for Computational Linguistics.
Cite (Informal):
ACCEPT: Adaptive Codebook for Composite and Efficient Prompt Tuning (Lin et al., Findings 2024)
PDF:
https://preview.aclanthology.org/build-pipeline-with-new-library/2024.findings-emnlp.900.pdf