Logit Space Constrained Fine-Tuning for Mitigating Hallucinations in LLM-Based Recommender Systems

Jianfeng Deng, Qingfeng Chen, Debo Cheng, Jiuyong Li, Lin Liu


Abstract
Large language models (LLMs) have gained increasing attention in recommender systems, but their inherent hallucination issues significantly compromise the accuracy and reliability of recommendation results. Existing LLM-based recommender systems predominantly rely on standard fine-tuning methodologies, often ignoring hallucination issues during the fine-tuning process. To address this challenge, we propose Logit Space Constrained Fine-Tuning (LCFT), a novel fine-tuning framework designed to mitigate hallucination in LLM-based recommenders. Specifically, LCFT takes as input semantically positive and negative instruction pairs and incorporates Kullback–Leibler (KL) divergence into the training objective to explicitly maximise their distributional disparity in the logit space. By conducting such logit space-constrained fine-tuning, LCFT encourages more distinguishable and semantically grounded representations, thereby reducing the model’s susceptibility to hallucination. Extensive experiments on two recommendation models with distinct LLM backbones and four real-world datasets demonstrate that LCFT consistently reduces hallucination and enhances recommendation performance.
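The KL-constrained objective described in the abstract can be sketched in plain Python. This is an illustrative reading, not the paper's exact formulation: the helper names, the subtractive combination of the two terms, and the weight `lam` are assumptions introduced here for clarity.

```python
import math

def softmax(logits):
    """Convert raw logits into a probability distribution (numerically stable)."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def kl_divergence(p, q, eps=1e-12):
    """KL(p || q) between two discrete distributions over the same support."""
    return sum(pi * math.log((pi + eps) / (qi + eps)) for pi, qi in zip(p, q))

def lcft_loss(ce_loss, pos_logits, neg_logits, lam=0.1):
    """Sketch of a logit-space-constrained objective: the standard
    fine-tuning loss minus a weighted KL term, so that minimising the
    total loss rewards a larger distributional gap between the
    positive- and negative-instruction logits.
    `lam` is a hypothetical trade-off hyperparameter."""
    p = softmax(pos_logits)
    q = softmax(neg_logits)
    return ce_loss - lam * kl_divergence(p, q)
```

With identical positive and negative logits the KL term vanishes and the objective reduces to the plain fine-tuning loss; as the two distributions separate, the KL term grows and the total loss drops, which is the constraint the abstract describes.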
Anthology ID:
2025.emnlp-main.1491
Volume:
Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
29299–29312
URL:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.1491/
Cite (ACL):
Jianfeng Deng, Qingfeng Chen, Debo Cheng, Jiuyong Li, and Lin Liu. 2025. Logit Space Constrained Fine-Tuning for Mitigating Hallucinations in LLM-Based Recommender Systems. In Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing, pages 29299–29312, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
Logit Space Constrained Fine-Tuning for Mitigating Hallucinations in LLM-Based Recommender Systems (Deng et al., EMNLP 2025)
PDF:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.1491.pdf
Checklist:
 2025.emnlp-main.1491.checklist.pdf