Local Normalization Distortion and the Thermodynamic Formalism of Decoding Strategies for Large Language Models

Tom Kempton, Stuart Burrell


Abstract
Advances in hardware and language model architecture have spurred a revolution in natural language generation. However, autoregressive models compute probability distributions over next-token choices, and sampling from these distributions, known as decoding, has received significantly less attention than other design choices. Existing decoding strategies are largely based on heuristics, resulting in methods that are difficult to apply or improve in a principled manner. We develop the theory of decoding strategies for language models by expressing popular decoding algorithms as equilibrium states in the language of ergodic theory and stating the objective functions they optimize. Using this, we analyze the effect of the local normalization step required to make probabilities sum to one in top-k, nucleus, and temperature sampling. We argue that local normalization distortion is a fundamental defect of decoding strategies and quantify the size of this distortion and its effect on mathematical proxies for the quality and diversity of generated text. This yields conclusions for the design of decoding algorithms and the detection of machine-generated text.
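The local normalization step the abstract refers to can be seen in a minimal sketch (an illustration of standard truncation sampling, not the paper's own code): each of top-k, nucleus, and temperature sampling reshapes the next-token distribution and then divides by a context-dependent constant `z` to make the kept probabilities sum to one.

```python
# Illustrative sketch of local normalization in common decoding strategies.
# `probs` maps tokens to next-token probabilities from the base model.

def top_k(probs, k):
    """Keep the k most probable tokens, then renormalize to sum to 1."""
    kept = dict(sorted(probs.items(), key=lambda kv: -kv[1])[:k])
    z = sum(kept.values())  # local normalization constant, varies with context
    return {t: p / z for t, p in kept.items()}

def nucleus(probs, p_threshold):
    """Keep the smallest top set whose mass reaches p_threshold, renormalize."""
    kept, mass = {}, 0.0
    for t, p in sorted(probs.items(), key=lambda kv: -kv[1]):
        kept[t] = p
        mass += p
        if mass >= p_threshold:
            break
    z = sum(kept.values())
    return {t: q / z for t, q in kept.items()}

def temperature(probs, tau):
    """Sharpen or flatten by exponent 1/tau, then renormalize."""
    powered = {t: p ** (1.0 / tau) for t, p in probs.items()}
    z = sum(powered.values())
    return {t: q / z for t, q in powered.items()}
```

The division by `z` at each step is what the paper terms local normalization: because `z` depends on the preceding context, the renormalized probability assigned to a multi-token continuation is distorted relative to the base model's probability for that same continuation.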
Anthology ID:
2025.findings-emnlp.1210
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2025
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
22216–22231
URL:
https://preview.aclanthology.org/author-page-yu-wang-polytechnic/2025.findings-emnlp.1210/
DOI:
10.18653/v1/2025.findings-emnlp.1210
Cite (ACL):
Tom Kempton and Stuart Burrell. 2025. Local Normalization Distortion and the Thermodynamic Formalism of Decoding Strategies for Large Language Models. In Findings of the Association for Computational Linguistics: EMNLP 2025, pages 22216–22231, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
Local Normalization Distortion and the Thermodynamic Formalism of Decoding Strategies for Large Language Models (Kempton & Burrell, Findings 2025)
PDF:
https://preview.aclanthology.org/author-page-yu-wang-polytechnic/2025.findings-emnlp.1210.pdf
Checklist:
2025.findings-emnlp.1210.checklist.pdf