FaST: Feature-aware Sampling and Tuning for Personalized Preference Alignment with Limited Data

Thibaut Thonet, Germán Kruszewski, Jos Rozen, Pierre Erbacher, Marc Dymetman


Abstract
LLM-powered conversational assistants are often deployed in a one-size-fits-all manner, which fails to accommodate individual user preferences. Recently, LLM personalization – tailoring models to align with specific user preferences – has gained increasing attention as a way to bridge this gap. In this work, we specifically focus on a practical yet challenging setting where only a small set of preference annotations can be collected per user – a problem we define as Personalized Preference Alignment with Limited Data (PPALLI). To support research in this area, we introduce two datasets – DnD and ELIP – and benchmark a variety of alignment techniques on them. We further propose FaST, a highly parameter-efficient approach that leverages high-level features automatically discovered from the data, achieving the best overall performance.
Anthology ID:
2025.emnlp-main.475
Volume:
Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rosé, Violet Peng
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
9352–9381
URL:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.475/
Cite (ACL):
Thibaut Thonet, Germán Kruszewski, Jos Rozen, Pierre Erbacher, and Marc Dymetman. 2025. FaST: Feature-aware Sampling and Tuning for Personalized Preference Alignment with Limited Data. In Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing, pages 9352–9381, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
FaST: Feature-aware Sampling and Tuning for Personalized Preference Alignment with Limited Data (Thonet et al., EMNLP 2025)
PDF:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.475.pdf
Checklist:
2025.emnlp-main.475.checklist.pdf