Uncertainty-Aware Contrastive Decoding

Hakyung Lee, Subeen Park, Joowang Kim, Sungjun Lim, Kyungwoo Song


Abstract
Large language models excel in a wide range of natural language processing tasks, but generating factually accurate and consistent outputs remains a challenge. To improve text reliability, Contrastive Decoding (CD) refines token selection by leveraging differences between an expert and base model, penalizing low-quality token choices. However, CD employs static weighting between models, making it sensitive to variations in model architecture and input characteristics, often resulting in suboptimal token selection and error propagation throughout generation. We propose Uncertainty-Aware Contrastive Decoding (UCD), a method that dynamically adjusts model contributions at each decoding step based on uncertainty. We introduce a cumulative energy function, where uncertainty is quantified as the negative log-sum-exp over logits, and decomposed into entropy and expected logit components. This energy serves as a dynamic confidence signal, guiding adaptive model weighting during generation. We demonstrate through extensive experiments that UCD significantly improves factual accuracy and reliability over existing decoding methods. Finally, we provide a theoretical analysis showing that our energy function serves as a well-defined uncertainty metric capturing model confidence. Our code is available at: https://github.com/MLAI-Yonsei/UCD.
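The abstract's energy function can be sketched directly: for a logit vector z with p = softmax(z), the identity logsumexp(z) = H(p) + E_p[z] means the negative log-sum-exp splits into an entropy term and an expected-logit term. Below is a minimal NumPy sketch of that quantity and its decomposition; the final `adaptive_weight` function is a hypothetical illustration of turning the two models' energies into a per-step mixing weight, not the paper's exact weighting rule.

```python
import numpy as np

def log_sum_exp(z):
    # Numerically stable logsumexp over a logit vector.
    m = z.max()
    return m + np.log(np.exp(z - m).sum())

def energy(z):
    # E(z) = -logsumexp(z); lower energy signals higher model confidence.
    return -log_sum_exp(z)

def energy_decomposed(z):
    # Equivalent form: -logsumexp(z) = -(H(p) + E_p[z]) with p = softmax(z).
    p = np.exp(z - log_sum_exp(z))          # softmax probabilities
    entropy = -(p * np.log(p)).sum()        # H(p)
    expected_logit = (p * z).sum()          # E_p[z]
    return -(entropy + expected_logit)

def adaptive_weight(z_expert, z_base):
    # Hypothetical: weight each model by relative confidence
    # (softmax over negative energies), assuming lower energy = more trust.
    e = np.array([energy(z_expert), energy(z_base)])
    w = np.exp(-e - (-e).max())
    return w / w.sum()                      # [w_expert, w_base]
```

The two forms of the energy agree term by term, so either can serve as the confidence signal; the decomposed version makes explicit that a confident step has both low entropy and high expected logit.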
Anthology ID:
2025.findings-acl.1352
Volume:
Findings of the Association for Computational Linguistics: ACL 2025
Month:
July
Year:
2025
Address:
Vienna, Austria
Editors:
Wanxiang Che, Joyce Nabende, Ekaterina Shutova, Mohammad Taher Pilehvar
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
26376–26391
URL:
https://preview.aclanthology.org/transition-to-people-yaml/2025.findings-acl.1352/
DOI:
10.18653/v1/2025.findings-acl.1352
Cite (ACL):
Hakyung Lee, Subeen Park, Joowang Kim, Sungjun Lim, and Kyungwoo Song. 2025. Uncertainty-Aware Contrastive Decoding. In Findings of the Association for Computational Linguistics: ACL 2025, pages 26376–26391, Vienna, Austria. Association for Computational Linguistics.
Cite (Informal):
Uncertainty-Aware Contrastive Decoding (Lee et al., Findings 2025)
PDF:
https://preview.aclanthology.org/transition-to-people-yaml/2025.findings-acl.1352.pdf