Prompting ELECTRA: Few-Shot Learning with Discriminative Pre-Trained Models
Mengzhou Xia, Mikel Artetxe, Jingfei Du, Danqi Chen, Veselin Stoyanov
Abstract
Pre-trained masked language models successfully perform few-shot learning by formulating downstream tasks as text infilling. However, as a strong alternative in full-shot settings, discriminative pre-trained models like ELECTRA do not fit into the paradigm. In this work, we adapt prompt-based few-shot learning to ELECTRA and show that it outperforms masked language models in a wide range of tasks. ELECTRA is pre-trained to distinguish if a token is generated or original. We naturally extend that to prompt-based few-shot learning by training to score the originality of the target options without introducing new parameters. Our method can be easily adapted to tasks involving multi-token predictions without extra computation overhead. Analysis shows that ELECTRA learns distributions that align better with downstream tasks.
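A minimal sketch of the idea described in the abstract: fill a prompt template with each candidate verbalizer and use ELECTRA's discriminator head to score how "original" the inserted tokens look, picking the highest-scoring option. This assumes the HuggingFace `google/electra-base-discriminator` checkpoint; the template (`"It was {option}."`) and verbalizers are illustrative placeholders, not the paper's exact prompts.

```python
# Sketch: zero-/few-shot classification by scoring option "originality"
# with ELECTRA's pre-training discriminator (no new parameters needed).
import torch
from transformers import ElectraForPreTraining, ElectraTokenizerFast

name = "google/electra-base-discriminator"
tokenizer = ElectraTokenizerFast.from_pretrained(name)
model = ElectraForPreTraining.from_pretrained(name)
model.eval()

def originality_score(text: str, option: str) -> float:
    """Average log-probability that each token of `option` is judged
    'original' (i.e., not replaced) by the discriminator head."""
    prompt = f"{text} It was {option}."  # illustrative template
    enc = tokenizer(prompt, return_tensors="pt")
    option_ids = tokenizer(option, add_special_tokens=False)["input_ids"]
    with torch.no_grad():
        # Higher logit = more likely the token was replaced (fake).
        logits = model(**enc).logits[0]
    # Locate the option's token span inside the encoded prompt.
    ids = enc["input_ids"][0].tolist()
    start = next(i for i in range(len(ids))
                 if ids[i:i + len(option_ids)] == option_ids)
    span = logits[start:start + len(option_ids)]
    # P(original) = sigmoid(-logit); average log-probs over option tokens.
    return torch.nn.functional.logsigmoid(-span).mean().item()

review = "A thoroughly enjoyable film from start to finish."
scores = {opt: originality_score(review, opt) for opt in ["great", "terrible"]}
print(max(scores, key=scores.get))  # expected: "great"
```

Averaging per-token scores over the option span is what lets this handle multi-token verbalizers in a single forward pass, consistent with the abstract's claim that multi-token predictions add no extra computation overhead.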
- Anthology ID:
- 2022.emnlp-main.780
- Volume:
- Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing
- Month:
- December
- Year:
- 2022
- Address:
- Abu Dhabi, United Arab Emirates
- Editors:
- Yoav Goldberg, Zornitsa Kozareva, Yue Zhang
- Venue:
- EMNLP
- Publisher:
- Association for Computational Linguistics
- Pages:
- 11351–11361
- URL:
- https://aclanthology.org/2022.emnlp-main.780
- DOI:
- 10.18653/v1/2022.emnlp-main.780
- Cite (ACL):
- Mengzhou Xia, Mikel Artetxe, Jingfei Du, Danqi Chen, and Veselin Stoyanov. 2022. Prompting ELECTRA: Few-Shot Learning with Discriminative Pre-Trained Models. In Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing, pages 11351–11361, Abu Dhabi, United Arab Emirates. Association for Computational Linguistics.
- Cite (Informal):
- Prompting ELECTRA: Few-Shot Learning with Discriminative Pre-Trained Models (Xia et al., EMNLP 2022)
- PDF:
- https://preview.aclanthology.org/nschneid-patch-2/2022.emnlp-main.780.pdf