Self-Regulated Sample Diversity in Large Language Models
Mingyue Liu | Jonathan Frawley | Sarah Wyer | Hubert P. H. Shum | Sara Uckelman | Sue Black | Chris Willcocks
Findings of the Association for Computational Linguistics: NAACL 2024
Sample diversity depends on the task: in mathematics, precision and determinism are paramount, while storytelling thrives on creativity and surprise. This paper presents a simple self-regulating approach in which sample-diversity inference parameters are adjusted dynamically based on the input prompt, in contrast to existing methods that require expensive and inflexible setups or keep the values static during inference. Capturing a broad spectrum of sample diversities can be formulated as a straightforward self-supervised inference task, which we find significantly improves response quality across tasks without model retraining or fine-tuning. In particular, our method yields significant improvements in all supercategories of the MMLU multitask benchmark (GPT-3.5: +4.4%, GPT-4: +1.5%), which covers a wide variety of difficult tasks spanning STEM, the humanities, and the social sciences.
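The abstract does not spell out the mechanics, but the core idea, letting the model itself choose a sampling temperature suited to the prompt at inference time, can be sketched in a few lines. The following is a minimal illustration, not the paper's implementation: the rating prompt, the direct 0.0–1.0 score-to-temperature mapping, the two-call structure, and the `gpt-3.5-turbo` model choice are all illustrative assumptions.

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Hypothetical rating instruction; the paper's actual wording may differ.
RATING_INSTRUCTION = (
    "Rate how much creative diversity the following task needs, from 0.0 "
    "(precise and deterministic, e.g. mathematics) to 1.0 (open-ended and "
    "creative, e.g. storytelling). Reply with a single number only."
)

def estimate_diversity(prompt: str) -> float:
    """Self-supervised step: ask the model how diverse its sampling should be."""
    reply = client.chat.completions.create(
        model="gpt-3.5-turbo",
        temperature=0.0,  # keep the meta-judgement itself deterministic
        messages=[
            {"role": "system", "content": RATING_INSTRUCTION},
            {"role": "user", "content": prompt},
        ],
    )
    try:
        score = float(reply.choices[0].message.content.strip())
    except ValueError:
        score = 0.5  # fall back to a neutral setting if parsing fails
    return min(max(score, 0.0), 1.0)

def answer(prompt: str) -> str:
    """Answer the prompt with a temperature chosen from the prompt itself."""
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",
        temperature=estimate_diversity(prompt),
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content

print(answer("What is 17 * 24?"))             # expect a low temperature
print(answer("Write a short surreal poem."))  # expect a high temperature
```

The design point the sketch tries to capture is that the regulation happens entirely at inference time: one extra low-temperature call decides the sampling parameters for the real call, with no retraining or fine-tuning, matching the setup the abstract describes.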