Few-Shot Text Generation with Natural Language Instructions

Timo Schick, Hinrich Schütze


Abstract
Providing pretrained language models with simple task descriptions in natural language enables them to solve some tasks in a fully unsupervised fashion. Moreover, when combined with regular learning from examples, this idea yields impressive few-shot results for a wide range of text classification tasks. It is also a promising direction to improve data efficiency in generative settings, but there are several challenges to using a combination of task descriptions and example-based learning for text generation. In particular, it is crucial to find task descriptions that are easy to understand for the pretrained model and to ensure that it actually makes good use of them; furthermore, effective measures against overfitting have to be implemented. In this paper, we show how these challenges can be tackled: We introduce GenPET, a method for text generation that is based on pattern-exploiting training, a recent approach for combining textual instructions with supervised learning that only works for classification tasks. On several summarization and headline generation datasets, GenPET gives consistent improvements over strong baselines in few-shot settings.
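To make the core idea concrete, below is a minimal sketch (not the authors' GenPET implementation) of how a natural-language instruction can be wrapped around an input document before generating with a pretrained encoder-decoder model. The model name and the wording of the instruction pattern are illustrative assumptions; the paper's actual method additionally involves few-shot fine-tuning with such patterns and measures against overfitting.

# Minimal sketch, assuming the Hugging Face transformers library is installed.
# The model checkpoint and the instruction pattern below are illustrative
# assumptions, not the configuration used in the paper.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_name = "google/pegasus-xsum"  # assumption: any pretrained seq2seq model
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

def apply_pattern(document: str) -> str:
    # Hypothetical instruction pattern in the spirit of pattern-exploiting
    # training: the task description is stated in natural language around
    # the input text.
    return f"Write a short headline for the following text: {document}"

document = "Scientists report that a new battery design doubles storage capacity..."
inputs = tokenizer(apply_pattern(document), return_tensors="pt", truncation=True)
output_ids = model.generate(**inputs, max_new_tokens=32, num_beams=4)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))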
Anthology ID:
2021.emnlp-main.32
Volume:
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2021
Address:
Online and Punta Cana, Dominican Republic
Editors:
Marie-Francine Moens, Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
390–402
URL:
https://aclanthology.org/2021.emnlp-main.32
DOI:
10.18653/v1/2021.emnlp-main.32
Cite (ACL):
Timo Schick and Hinrich Schütze. 2021. Few-Shot Text Generation with Natural Language Instructions. In Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, pages 390–402, Online and Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
Few-Shot Text Generation with Natural Language Instructions (Schick & Schütze, EMNLP 2021)
PDF:
https://preview.aclanthology.org/naacl24-info/2021.emnlp-main.32.pdf
Video:
https://preview.aclanthology.org/naacl24-info/2021.emnlp-main.32.mp4
Data:
AESLC, Reddit TIFU