Learning How to Ask: Querying LMs with Mixtures of Soft Prompts

Guanghui Qin, Jason Eisner


Abstract
Natural-language prompts have recently been used to coax pretrained language models into performing other AI tasks, using a fill-in-the-blank paradigm (Petroni et al., 2019) or a few-shot extrapolation paradigm (Brown et al., 2020). For example, language models retain factual knowledge from their training corpora that can be extracted by asking them to “fill in the blank” in a sentential prompt. However, where does this prompt come from? We explore the idea of learning prompts by gradient descent—either fine-tuning prompts taken from previous work, or starting from random initialization. Our prompts consist of “soft words,” i.e., continuous vectors that are not necessarily word type embeddings from the language model. Furthermore, for each task, we optimize a mixture of prompts, learning which prompts are most effective and how to ensemble them. Across multiple English LMs and tasks, our approach hugely outperforms previous methods, showing that the implicit factual knowledge in language models was previously underestimated. Moreover, this knowledge is cheap to elicit: random initialization is nearly as good as informed initialization.
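The following is a minimal sketch of the idea the abstract describes, not the authors' implementation (their code is linked under "Code" below as hiaoxui/soft-prompts). It assumes a frozen HuggingFace BERT masked LM; the prompt shape, hyperparameters, and the example relation instance are illustrative choices. Only the soft prompt vectors and the mixture weights receive gradients.

```python
import torch
import torch.nn.functional as F
from transformers import BertTokenizer, BertForMaskedLM

tokenizer = BertTokenizer.from_pretrained("bert-base-cased")
lm = BertForMaskedLM.from_pretrained("bert-base-cased")
lm.eval()
for p in lm.parameters():          # the language model itself stays frozen
    p.requires_grad_(False)

embed = lm.bert.embeddings.word_embeddings     # embeds the ordinary (hard) tokens
dim = embed.embedding_dim

NUM_PROMPTS, PROMPT_LEN = 4, 3                 # illustrative: 4 prompts, 3 soft tokens each
soft_prompts = torch.nn.Parameter(0.02 * torch.randn(NUM_PROMPTS, PROMPT_LEN, dim))
mixture_logits = torch.nn.Parameter(torch.zeros(NUM_PROMPTS))   # learned ensemble weights

def prompt_log_probs(subject: str) -> torch.Tensor:
    """Log-distribution over the vocabulary at the [MASK] slot, one row per prompt.

    Each prompt is laid out as  [CLS] <soft tokens> <subject> [MASK] [SEP],
    with the soft tokens fed directly as input embeddings."""
    subj_ids = tokenizer(subject, add_special_tokens=False, return_tensors="pt").input_ids
    cls_e  = embed(torch.tensor([[tokenizer.cls_token_id]]))
    subj_e = embed(subj_ids)
    mask_e = embed(torch.tensor([[tokenizer.mask_token_id]]))
    sep_e  = embed(torch.tensor([[tokenizer.sep_token_id]]))
    rows = []
    for k in range(NUM_PROMPTS):
        inputs = torch.cat([cls_e, soft_prompts[k].unsqueeze(0), subj_e, mask_e, sep_e], dim=1)
        logits = lm(inputs_embeds=inputs).logits        # (1, seq_len, vocab)
        mask_pos = 1 + PROMPT_LEN + subj_ids.size(1)    # position of [MASK] in the sequence
        rows.append(F.log_softmax(logits[0, mask_pos], dim=-1))
    return torch.stack(rows)                            # (NUM_PROMPTS, vocab)

def mixture_nll(subject: str, obj: str) -> torch.Tensor:
    """Negative log-likelihood of the object word under the prompt mixture."""
    obj_id = tokenizer(obj, add_special_tokens=False).input_ids[0]   # single-token objects only
    log_p = prompt_log_probs(subject)[:, obj_id]                     # per-prompt log-prob
    log_mix = F.log_softmax(mixture_logits, dim=0)
    return -torch.logsumexp(log_mix + log_p, dim=0)

optimizer = torch.optim.Adam([soft_prompts, mixture_logits], lr=1e-3)
loss = mixture_nll("Barack Obama", "Hawaii")   # e.g., one instance of a birthplace relation
loss.backward()
optimizer.step()
```

In this sketch the per-prompt predictions are combined at the distribution level (a log-sum-exp over learned mixture weights), so training both sharpens each soft prompt and learns which prompts to trust, echoing the abstract's "learning which prompts are most effective and how to ensemble them."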
Anthology ID:
2021.naacl-main.410
Volume:
Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies
Month:
June
Year:
2021
Address:
Online
Editors:
Kristina Toutanova, Anna Rumshisky, Luke Zettlemoyer, Dilek Hakkani-Tur, Iz Beltagy, Steven Bethard, Ryan Cotterell, Tanmoy Chakraborty, Yichao Zhou
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
5203–5212
URL:
https://aclanthology.org/2021.naacl-main.410
DOI:
10.18653/v1/2021.naacl-main.410
Award:
Best Short Paper
Cite (ACL):
Guanghui Qin and Jason Eisner. 2021. Learning How to Ask: Querying LMs with Mixtures of Soft Prompts. In Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, pages 5203–5212, Online. Association for Computational Linguistics.
Cite (Informal):
Learning How to Ask: Querying LMs with Mixtures of Soft Prompts (Qin & Eisner, NAACL 2021)
PDF:
https://preview.aclanthology.org/nschneid-patch-5/2021.naacl-main.410.pdf
Optional supplementary code:
2021.naacl-main.410.OptionalSupplementaryCode.zip
Optional supplementary data:
2021.naacl-main.410.OptionalSupplementaryData.zip
Video:
https://preview.aclanthology.org/nschneid-patch-5/2021.naacl-main.410.mp4
Code
hiaoxui/soft-prompts (+ additional community code)
Data
ConceptNet, LAMA, T-REx