SMoP: Towards Efficient and Effective Prompt Tuning with Sparse Mixture-of-Prompts

Joon-Young Choi, Junho Kim, Jun-Hyung Park, Wing-Lam Mok, SangKeun Lee


Abstract
Prompt tuning has emerged as a successful parameter-efficient alternative to the full fine-tuning of language models. However, prior works on prompt tuning often utilize long soft prompts of up to 100 tokens to improve performance, overlooking the inefficiency associated with extended inputs. In this paper, we propose a novel prompt tuning method SMoP (Sparse Mixture-of-Prompts) that utilizes short soft prompts for efficient training and inference while maintaining performance gains typically induced from longer soft prompts. To achieve this, SMoP employs a gating mechanism to train multiple short soft prompts specialized in handling different subsets of the data, providing an alternative to relying on a single long soft prompt to cover the entire data. Experimental results demonstrate that SMoP outperforms baseline methods while reducing training and inference costs. We release our code at https://github.com/jyjohnchoi/SMoP.
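The core idea in the abstract — a gating mechanism that sparsely routes each input to one of several short soft prompts instead of using a single long prompt — can be illustrated with a minimal sketch. This is not the authors' implementation; the dimensions, the mean-pooling of token embeddings, and the linear gate are illustrative assumptions, and random arrays stand in for learned parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

d = 16      # token embedding dimension (assumed for illustration)
k = 4       # number of short soft prompts (assumed)
p_len = 5   # length of each short soft prompt (assumed)

# k short soft prompts, each (p_len, d); learned in practice, random here
prompts = rng.normal(size=(k, p_len, d))
# linear gate mapping a pooled input representation to k routing scores
W_gate = rng.normal(size=(d, k))

def route(x_emb):
    """Sparse top-1 routing: pick one short prompt per input and prepend it.

    x_emb: (seq_len, d) token embeddings of a single input.
    Returns the prompt-prepended sequence and the chosen prompt index.
    """
    pooled = x_emb.mean(axis=0)        # mean-pool input tokens (assumption)
    scores = pooled @ W_gate           # (k,) gate logits
    probs = np.exp(scores - scores.max())
    probs /= probs.sum()               # softmax over the k prompts
    idx = int(np.argmax(probs))        # sparse: only one short prompt is used
    return np.concatenate([prompts[idx], x_emb], axis=0), idx

x = rng.normal(size=(10, d))           # a dummy 10-token input
out, chosen = route(x)
print(out.shape)                       # (15, 16): 5 prompt tokens + 10 input tokens
print(chosen)
```

Because only one short prompt (here 5 tokens) is prepended per input rather than a single long prompt of up to 100 tokens, the attention cost of each forward pass shrinks, which is the efficiency argument the abstract makes.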
Anthology ID:
2023.emnlp-main.884
Volume:
Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
14306–14316
URL:
https://aclanthology.org/2023.emnlp-main.884
DOI:
10.18653/v1/2023.emnlp-main.884
Cite (ACL):
Joon-Young Choi, Junho Kim, Jun-Hyung Park, Wing-Lam Mok, and SangKeun Lee. 2023. SMoP: Towards Efficient and Effective Prompt Tuning with Sparse Mixture-of-Prompts. In Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing, pages 14306–14316, Singapore. Association for Computational Linguistics.
Cite (Informal):
SMoP: Towards Efficient and Effective Prompt Tuning with Sparse Mixture-of-Prompts (Choi et al., EMNLP 2023)
PDF:
https://preview.aclanthology.org/add_acl24_videos/2023.emnlp-main.884.pdf
Video:
https://preview.aclanthology.org/add_acl24_videos/2023.emnlp-main.884.mp4