Parameter Efficient Multi-task Fine-tuning by Learning to Transfer Token-wise Prompts
Muling Wu, Wenhao Liu, Jianhan Xu, Changze Lv, Zixuan Ling, Tianlong Li, Longtao Huang, Xiaoqing Zheng, Xuanjing Huang
Abstract
Prompt tuning has proven successful on various tasks by introducing a small number of trainable parameters while keeping large pre-trained language models (PLMs) frozen. However, it remains unsettled how to generate suitable prompts for individual examples and how to extend prompt tuning to multi-task learning by leveraging cross-task features. To address these challenges, we propose token-wise prompt tuning (TPT), in which a bank of finer-grained soft prompt tokens is built for multi-task learning by a memory network. Given an input example, tokens are retrieved from the bank and assembled into an instance-dependent prompt. Extensive experiments on 14 datasets demonstrate that models enhanced by TPT perform far better than fully fine-tuned models and achieve state-of-the-art results while tuning only 0.035% of the parameters.
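The abstract describes retrieving soft prompt tokens from a shared bank and assembling them into an instance-dependent prompt that is fed to a frozen PLM. The snippet below is a minimal PyTorch sketch of that general idea; the class name, dimensions, and attention-based retrieval are illustrative assumptions, not the authors' exact formulation.

```python
import torch
import torch.nn as nn

class TokenWisePromptBank(nn.Module):
    """Sketch of instance-dependent prompt assembly from a shared token bank
    (illustrative only; details differ from the paper's actual method)."""

    def __init__(self, bank_size=100, prompt_len=20, hidden_dim=768):
        super().__init__()
        # Shared bank of fine-grained soft prompt tokens (the "memory").
        self.bank = nn.Parameter(torch.randn(bank_size, hidden_dim) * 0.02)
        # Learnable queries, one per prompt position to be assembled.
        self.queries = nn.Parameter(torch.randn(prompt_len, hidden_dim) * 0.02)
        self.scale = hidden_dim ** -0.5

    def forward(self, input_repr):
        # input_repr: (batch, hidden_dim), e.g. a pooled embedding of the example.
        # Condition the queries on the input so the retrieved prompt is instance-dependent.
        q = self.queries.unsqueeze(0) + input_repr.unsqueeze(1)      # (B, P, H)
        attn = torch.softmax(q @ self.bank.T * self.scale, dim=-1)   # (B, P, bank)
        prompt = attn @ self.bank                                    # (B, P, H)
        return prompt

# Usage: assemble a prompt and prepend it to the frozen PLM's input embeddings.
bank = TokenWisePromptBank()
input_repr = torch.randn(4, 768)          # pooled representations of 4 examples
token_embeds = torch.randn(4, 32, 768)    # input embeddings from the frozen PLM
prompt = bank(input_repr)                 # (4, 20, 768)
inputs_embeds = torch.cat([prompt, token_embeds], dim=1)  # passed to the frozen PLM
```

Only the bank and query parameters would be trained under this scheme, which is what keeps the tunable parameter count small relative to the frozen PLM.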
- Anthology ID:
- 2023.findings-emnlp.584
- Volume:
- Findings of the Association for Computational Linguistics: EMNLP 2023
- Month:
- December
- Year:
- 2023
- Address:
- Singapore
- Editors:
- Houda Bouamor, Juan Pino, Kalika Bali
- Venue:
- Findings
- Publisher:
- Association for Computational Linguistics
- Pages:
- 8734–8746
- URL:
- https://aclanthology.org/2023.findings-emnlp.584
- DOI:
- 10.18653/v1/2023.findings-emnlp.584
- Cite (ACL):
- Muling Wu, Wenhao Liu, Jianhan Xu, Changze Lv, Zixuan Ling, Tianlong Li, Longtao Huang, Xiaoqing Zheng, and Xuanjing Huang. 2023. Parameter Efficient Multi-task Fine-tuning by Learning to Transfer Token-wise Prompts. In Findings of the Association for Computational Linguistics: EMNLP 2023, pages 8734–8746, Singapore. Association for Computational Linguistics.
- Cite (Informal):
- Parameter Efficient Multi-task Fine-tuning by Learning to Transfer Token-wise Prompts (Wu et al., Findings 2023)
- PDF:
- https://aclanthology.org/2023.findings-emnlp.584.pdf