Abstract
Few-shot classification has made great strides due to foundation models that, through priming and prompting, are highly effective few-shot learners. However, this approach has high variance both across different sets of few shots (*data selection*) and across different finetuning runs (*run variability*). This is problematic not only because it impedes the fair comparison of different approaches, but especially because it makes few-shot learning too unreliable for many real-world applications. To alleviate these issues, we make two contributions for more stable and effective few-shot learning: First, we propose novel ensembling methods and show that they substantially reduce *run variability*. Second, we introduce a new active learning (AL) criterion for *data selection* and present the first AL-based approach specifically tailored towards prompt-based learning. In our experiments, we show that our combined method, MEAL (**M**ultiprompt finetuning and prediction **E**nsembling with **A**ctive **L**earning), improves overall performance of prompt-based finetuning by 2.3 points on five diverse tasks. We publicly share our code and data splits at https://github.com/akoksal/MEAL.
- Anthology ID:
- 2023.findings-emnlp.36
- Volume:
- Findings of the Association for Computational Linguistics: EMNLP 2023
- Month:
- December
- Year:
- 2023
- Address:
- Singapore
- Editors:
- Houda Bouamor, Juan Pino, Kalika Bali
- Venue:
- Findings
- Publisher:
- Association for Computational Linguistics
- Pages:
- 506–517
- URL:
- https://aclanthology.org/2023.findings-emnlp.36
- DOI:
- 10.18653/v1/2023.findings-emnlp.36
- Cite (ACL):
- Abdullatif Köksal, Timo Schick, and Hinrich Schuetze. 2023. MEAL: Stable and Active Learning for Few-Shot Prompting. In Findings of the Association for Computational Linguistics: EMNLP 2023, pages 506–517, Singapore. Association for Computational Linguistics.
- Cite (Informal):
- MEAL: Stable and Active Learning for Few-Shot Prompting (Köksal et al., Findings 2023)
- PDF:
- https://preview.aclanthology.org/add_acl24_videos/2023.findings-emnlp.36.pdf