Surprise Calibration for Better In-Context Learning

Zhihang Tan, Jingrui Hou, Ping Wang, Qibiao Hu, Peng Zhu


Abstract
In-context learning (ICL) has emerged as a powerful paradigm for task adaptation in large language models (LLMs), where models infer underlying task structures from a few demonstrations. However, ICL remains susceptible to biases that arise from prior knowledge and contextual demonstrations, which can degrade the performance of LLMs. Existing bias calibration methods typically apply fixed class priors across all inputs, limiting their efficacy in dynamic ICL settings where the context for each query differs. To address these limitations, we adopt implicit sequential Bayesian inference as a framework for interpreting ICL, identify “surprise” as an informative signal for class prior shift, and introduce a novel method—Surprise Calibration (SC). SC leverages the notion of surprise to capture the temporal dynamics of class priors, providing a more adaptive and computationally efficient solution for in-context learning. We empirically demonstrate the superiority of SC over existing bias calibration techniques across a range of benchmark natural language processing tasks.
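The abstract does not spell out the update rule, so the following is only a minimal illustrative sketch of the general idea it describes, not the authors' Surprise Calibration algorithm: treat each demonstration's surprise (assumed here to be the negative log-probability the model assigns to its gold label) as a signal for how strongly to shift a running class-prior estimate, then divide the estimated prior out of the query-time prediction. The function names, the exponential-moving-average update, and the surprise-to-step mapping are all assumptions made for illustration.

# Illustrative sketch only: a generic surprise-driven prior update for ICL
# classification, NOT the exact method proposed in the paper.
import numpy as np

def track_prior_with_surprise(demo_label_probs, demo_labels, num_classes, lr=0.1):
    """Sequentially estimate a class prior from demonstration-level surprise.

    demo_label_probs: (T, C) array, model's label distribution at each demo step.
    demo_labels:      length-T array of gold label indices for the demonstrations.
    Returns the final estimated class prior (length-C, sums to 1).
    """
    prior = np.full(num_classes, 1.0 / num_classes)   # start from a uniform prior
    for probs, y in zip(demo_label_probs, demo_labels):
        surprise = -np.log(probs[y] + 1e-12)          # large when the gold label was unexpected
        step = lr * (1.0 - np.exp(-surprise))         # assumption: bigger surprise -> bigger update
        target = np.eye(num_classes)[y]               # observed label as a one-hot vector
        prior = (1.0 - step) * prior + step * target  # moving average toward observed labels
        prior /= prior.sum()
    return prior

def correct_query_prediction(query_probs, prior):
    """Divide out the estimated prior and renormalize (contextual-calibration style)."""
    corrected = query_probs / (prior + 1e-12)
    return corrected / corrected.sum()

In this sketch, a demonstration the model already predicts well barely moves the prior, while a surprising one pulls the prior toward its observed label; the resulting prior is then used to debias the query prediction. This captures the "temporal dynamics of class priors" described in the abstract only at a schematic level.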
Anthology ID:
2025.emnlp-main.1175
Volume:
Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
23056–23071
URL:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.1175/
Cite (ACL):
Zhihang Tan, Jingrui Hou, Ping Wang, Qibiao Hu, and Peng Zhu. 2025. Surprise Calibration for Better In-Context Learning. In Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing, pages 23056–23071, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
Surprise Calibration for Better In-Context Learning (Tan et al., EMNLP 2025)
PDF:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.1175.pdf
Checklist:
2025.emnlp-main.1175.checklist.pdf