Personalized LLM Decoding via Contrasting Personal Preference

Hyungjune Bu, ChanJoo Jung, Minjae Kang, Jaehyung Kim


Abstract
As large language models (LLMs) are progressively deployed in various real-world applications, personalization of LLMs has become increasingly important. While various approaches to LLM personalization, such as prompt-based and training-based methods, have been actively explored, effective decoding-time algorithms remain largely overlooked despite their demonstrated potential. In this paper, we propose Contrasting Personal Preference (CoPe), a novel decoding-time approach applied after performing parameter-efficient fine-tuning (PEFT) on user-specific data. Our core idea is to leverage reward-guided decoding specifically for personalization by maximizing each user's implicit reward signal. We evaluate CoPe across five open-ended personalized text generation tasks. Our empirical results demonstrate that CoPe achieves strong performance, improving personalization by an average of 10.57% in ROUGE-L without relying on external reward models or additional training procedures.
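To illustrate the kind of procedure the abstract describes, below is a minimal, hedged sketch of contrastive, reward-guided decoding with a per-user PEFT adapter. It assumes (not stated in the abstract) that the implicit reward is the token-level log-probability ratio between the PEFT-adapted personalized model and the base model, combined with the personalized distribution via a hypothetical weight `alpha`; the model name, adapter path, combination rule, and greedy selection are all illustrative assumptions, not the paper's exact algorithm.

```python
# Hedged sketch: contrastive decoding with a per-user PEFT (LoRA) adapter.
# Assumptions: implicit reward ~ log p_personal - log p_base; `alpha`,
# model/adapter names, and greedy selection are placeholders.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_name = "meta-llama/Llama-2-7b-hf"      # placeholder base model
adapter_path = "path/to/user_lora_adapter"  # hypothetical per-user adapter

tokenizer = AutoTokenizer.from_pretrained(base_name)
base = AutoModelForCausalLM.from_pretrained(base_name, torch_dtype=torch.float16)
personal = PeftModel.from_pretrained(
    AutoModelForCausalLM.from_pretrained(base_name, torch_dtype=torch.float16),
    adapter_path,
)

@torch.no_grad()
def contrastive_personal_decode(prompt, max_new_tokens=64, alpha=1.0):
    ids = tokenizer(prompt, return_tensors="pt").input_ids
    for _ in range(max_new_tokens):
        lp_personal = personal(ids).logits[:, -1].log_softmax(-1)
        lp_base = base(ids).logits[:, -1].log_softmax(-1)
        # Treat the log-prob ratio as an implicit per-user reward and add it
        # to the personalized distribution (assumed combination rule).
        scores = lp_personal + alpha * (lp_personal - lp_base)
        next_id = scores.argmax(dim=-1, keepdim=True)
        ids = torch.cat([ids, next_id], dim=-1)
        if next_id.item() == tokenizer.eos_token_id:
            break
    return tokenizer.decode(ids[0], skip_special_tokens=True)
```

As in the paper's framing, no external reward model or extra training beyond the PEFT step is involved here; the contrast between the adapted and base models supplies the reward signal at decoding time.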
Anthology ID:
2025.emnlp-main.1723
Volume:
Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
33946–33966
URL:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.1723/
Cite (ACL):
Hyungjune Bu, ChanJoo Jung, Minjae Kang, and Jaehyung Kim. 2025. Personalized LLM Decoding via Contrasting Personal Preference. In Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing, pages 33946–33966, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
Personalized LLM Decoding via Contrasting Personal Preference (Bu et al., EMNLP 2025)
PDF:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.1723.pdf
Checklist:
2025.emnlp-main.1723.checklist.pdf