Improving the Sample Efficiency of Prompt Tuning with Domain Adaptation

Xu Guo, Boyang Li, Han Yu


Abstract
Prompt tuning, or the conditioning of a frozen pretrained language model (PLM) with soft prompts learned from data, has demonstrated impressive performance on a wide range of NLP tasks. However, prompt tuning requires a large training dataset to be effective and is outperformed by finetuning the entire PLM in data-scarce regimes. Previous work (Gu et al., 2022; Vu et al., 2022) proposed to transfer soft prompts pretrained on the source domain to the target domain. In this paper, we explore domain adaptation for prompt tuning, a problem setting where unlabeled data from the target domain are available during pretraining. We propose bOosting Prompt TunIng with doMain Adaptation (OPTIMA), which regularizes the decision boundary to be smooth around regions where source and target data distributions are similar. Extensive experiments demonstrate that OPTIMA significantly enhances the transferability and sample efficiency of prompt tuning compared to strong baselines. Moreover, in few-shot settings, OPTIMA exceeds full-model tuning by a large margin.
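
The core idea in the abstract, keeping the PLM frozen, tuning only a soft prompt, and encouraging the decision boundary to stay smooth around target-domain inputs, can be conveyed with a short sketch. The snippet below is only an illustration under stated assumptions: the toy model, the VAT-style adversarial perturbation, the fixed loss weighting, and all names (`soft_prompt`, `predict`, `smoothness_loss`) are hypothetical stand-ins for exposition, not OPTIMA's actual implementation.

```python
# Hedged sketch of a smoothness (consistency) regularizer: predictions should
# not change under a small adversarial perturbation of a target-domain input.
# Everything here is a toy stand-in, not the paper's implementation.
import torch
import torch.nn.functional as F

torch.manual_seed(0)
feat_dim, n_classes = 32, 2

# Stand-in for "frozen PLM + trainable soft prompt": only the soft prompt and
# a small classifier head are updated; the "PLM" is reduced to fixed features.
soft_prompt = torch.zeros(feat_dim, requires_grad=True)
head = torch.nn.Linear(feat_dim, n_classes)

def predict(x):
    # Conditioning on the soft prompt is approximated by adding it to the
    # frozen input features (a simplification for illustration only).
    return head(x + soft_prompt)

def smoothness_loss(x_tgt, eps=1e-2):
    """KL divergence between predictions on x_tgt and on an adversarially
    perturbed x_tgt (a VAT-style perturbation; an assumption, not OPTIMA)."""
    with torch.no_grad():
        p = F.softmax(predict(x_tgt), dim=-1)
    d = torch.randn_like(x_tgt, requires_grad=True)   # random direction
    kl = F.kl_div(F.log_softmax(predict(x_tgt + 1e-3 * d), dim=-1),
                  p, reduction="batchmean")
    (g,) = torch.autograd.grad(kl, d)                 # direction of steepest change
    r_adv = eps * F.normalize(g, dim=-1)              # worst-case small perturbation
    return F.kl_div(F.log_softmax(predict(x_tgt + r_adv), dim=-1),
                    p, reduction="batchmean")

# Labeled source batch and unlabeled target batch (random toy data).
x_src = torch.randn(8, feat_dim)
y_src = torch.randint(0, n_classes, (8,))
x_tgt = torch.randn(8, feat_dim)

opt = torch.optim.Adam([soft_prompt] + list(head.parameters()), lr=1e-3)
loss = F.cross_entropy(predict(x_src), y_src) + 0.1 * smoothness_loss(x_tgt)
opt.zero_grad()
loss.backward()
opt.step()
print(f"combined loss: {loss.item():.4f}")
```

In the paper's actual setting, the regularizer would operate on PLM representations conditioned on soft prompts and would account for the similarity between source and target distributions; the sketch only conveys the general consistency-regularization shape of such an objective.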
Anthology ID:
2022.findings-emnlp.258
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2022
Month:
December
Year:
2022
Address:
Abu Dhabi, United Arab Emirates
Editors:
Yoav Goldberg, Zornitsa Kozareva, Yue Zhang
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
3523–3537
URL:
https://aclanthology.org/2022.findings-emnlp.258
DOI:
10.18653/v1/2022.findings-emnlp.258
Cite (ACL):
Xu Guo, Boyang Li, and Han Yu. 2022. Improving the Sample Efficiency of Prompt Tuning with Domain Adaptation. In Findings of the Association for Computational Linguistics: EMNLP 2022, pages 3523–3537, Abu Dhabi, United Arab Emirates. Association for Computational Linguistics.
Cite (Informal):
Improving the Sample Efficiency of Prompt Tuning with Domain Adaptation (Guo et al., Findings 2022)
PDF:
https://preview.aclanthology.org/dois-2013-emnlp/2022.findings-emnlp.258.pdf
Video:
https://preview.aclanthology.org/dois-2013-emnlp/2022.findings-emnlp.258.mp4