Active Layer-Contrastive Decoding Reduces Hallucination in Large Language Model Generation

Hongxiang Zhang, Hao Chen, Muhao Chen, Tianyi Zhang


Abstract
Recent decoding methods improve the factuality of large language models (LLMs) by refining how the next token is selected during generation. These methods typically operate at the token level, leveraging internal representations to suppress superficial patterns. Nevertheless, LLMs remain prone to hallucinations, especially over longer contexts. In this paper, we propose Active Layer-Contrastive Decoding (ActLCD), a novel decoding strategy that actively decides when to apply layer contrasting during generation. By casting decoding as a sequential decision-making problem, ActLCD employs a reinforcement learning policy guided by a reward-aware classifier to optimize factuality beyond the token level. Our experiments demonstrate that ActLCD surpasses state-of-the-art methods across five benchmarks, showcasing its effectiveness in mitigating hallucinations in diverse generation scenarios.
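To make the abstract's mechanism concrete, here is a minimal, hypothetical Python sketch of layer-contrastive decoding with an "active" gate. It assumes access to final-layer ("mature") and intermediate-layer ("premature") logits; the simple threshold gate (`gate_score`, `tau`) is a stand-in for the paper's learned reward-aware RL policy, and the DoLa-style log-probability contrast is one common way to contrast layers, not necessarily the paper's exact formulation.

```python
# Illustrative sketch only: the threshold gate below is a hypothetical
# placeholder for ActLCD's reward-aware RL policy, which is not reproduced.
import torch
import torch.nn.functional as F

def contrastive_logits(mature_logits, premature_logits, alpha=0.1):
    """DoLa-style contrast: subtract the premature layer's log-probs from
    the mature (final) layer's, restricted to a plausible token set."""
    log_p_mature = F.log_softmax(mature_logits, dim=-1)
    log_p_premature = F.log_softmax(premature_logits, dim=-1)
    # Adaptive plausibility constraint: keep tokens within log(alpha)
    # of the mature layer's most likely token.
    cutoff = log_p_mature.max(dim=-1, keepdim=True).values \
        + torch.log(torch.tensor(alpha))
    contrast = log_p_mature - log_p_premature
    return torch.where(log_p_mature >= cutoff, contrast,
                       torch.full_like(contrast, float("-inf")))

def active_step(mature_logits, premature_logits, gate_score, tau=0.5):
    """Apply the contrast only when the gating policy fires; otherwise
    fall back to standard decoding from the final layer."""
    if gate_score > tau:
        scores = contrastive_logits(mature_logits, premature_logits)
    else:
        scores = mature_logits
    return scores.argmax(dim=-1)  # greedy selection, for illustration

# Toy usage with random logits over a 10-token vocabulary.
torch.manual_seed(0)
mature = torch.randn(1, 10)
premature = torch.randn(1, 10)
print(active_step(mature, premature, gate_score=0.8))
```

The sketch captures the key design choice described in the abstract: rather than contrasting layers at every token, a per-step decision determines when contrasting is applied, framing decoding as sequential decision making.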
Anthology ID:
2025.emnlp-main.150
Volume:
Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
3028–3046
URL:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.150/
Cite (ACL):
Hongxiang Zhang, Hao Chen, Muhao Chen, and Tianyi Zhang. 2025. Active Layer-Contrastive Decoding Reduces Hallucination in Large Language Model Generation. In Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing, pages 3028–3046, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
Active Layer-Contrastive Decoding Reduces Hallucination in Large Language Model Generation (Zhang et al., EMNLP 2025)
PDF:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.150.pdf
Checklist:
2025.emnlp-main.150.checklist.pdf