In-Context Example Ordering Guided by Label Distributions

Zhichao Xu, Daniel Cohen, Bei Wang, Vivek Srikumar


Abstract
By allowing models to predict without task-specific training, in-context learning (ICL) with pretrained LLMs has enormous potential in NLP. However, several problems persist in ICL; in particular, its performance is sensitive to the choice and order of in-context examples. Given the same set of in-context examples in different orderings, model performance may vary from near random to near state-of-the-art. In this work, we formulate in-context example ordering as an optimization problem and examine three problem settings that differ in their assumptions about what is known about the task. Inspired by the idea of learning from label proportions, we propose two principles for in-context example ordering guided by the model's probability predictions. We apply these principles to thirteen text classification datasets and nine autoregressive LLMs with 700M to 13B parameters, and demonstrate that our approach outperforms the baselines: it improves classification accuracy, reduces model miscalibration, and selects better in-context examples.
Anthology ID:
2024.findings-naacl.167
Volume:
Findings of the Association for Computational Linguistics: NAACL 2024
Month:
June
Year:
2024
Address:
Mexico City, Mexico
Editors:
Kevin Duh, Helena Gomez, Steven Bethard
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
2623–2640
URL:
https://aclanthology.org/2024.findings-naacl.167
Cite (ACL):
Zhichao Xu, Daniel Cohen, Bei Wang, and Vivek Srikumar. 2024. In-Context Example Ordering Guided by Label Distributions. In Findings of the Association for Computational Linguistics: NAACL 2024, pages 2623–2640, Mexico City, Mexico. Association for Computational Linguistics.
Cite (Informal):
In-Context Example Ordering Guided by Label Distributions (Xu et al., Findings 2024)
PDF:
https://preview.aclanthology.org/naacl24-info/2024.findings-naacl.167.pdf
Copyright:
2024.findings-naacl.167.copyright.pdf