MPTA: MultiTask Personalization Assessment

Matthieu Tehenan, Eric Chamoun, Andreas Vlachos


Abstract
Large language models are increasingly expected to adapt to individual users, reflecting differences in preferences, values, and communication styles. To evaluate whether models can serve diverse populations, we introduce MPTA, a benchmark that leverages large-scale survey data (WVS, EVS, GSS) to construct real, hyper-granular personas spanning demographics, beliefs, and values. Unlike prior benchmarks that rely on synthetic profiles or narrow trait prediction, MPTA conditions models on real personas and systematically tests their behavior across core alignment tasks. We show that persona conditioning exposes pluralistic misalignment: while aggregate metrics suggest models are truthful and safe, subgroup-specific evaluations reveal hidden pockets of degraded factuality, fairness disparities, and inconsistent value alignment. Alongside the benchmark, we release a dataset, toolkit, and baseline evaluations. MPTA is designed with extensibility and sustainability in mind: as the underlying survey datasets are regularly updated, MPTA supports regular integration of new populations and user traits.
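The abstract's core idea, conditioning a model on a survey-derived persona and then scoring results per subgroup rather than in aggregate, can be sketched as follows. This is a minimal illustration only: the field names, prompt template, and scoring function are hypothetical and do not reflect MPTA's actual toolkit or API.

```python
# Hypothetical sketch of persona-conditioned evaluation. Persona fields,
# the prompt template, and the scoring scheme are illustrative assumptions.
from collections import defaultdict

def persona_prompt(persona: dict, question: str) -> str:
    """Prefix a task question with a survey-derived persona description."""
    traits = "; ".join(f"{k}: {v}" for k, v in persona.items())
    return f"You are a person with the following profile: {traits}.\n{question}"

def subgroup_scores(results, group_key):
    """Aggregate correctness per subgroup; disparities across groups are
    invisible in a single aggregate metric (pluralistic misalignment)."""
    totals = defaultdict(lambda: [0, 0])  # group -> [correct, count]
    for persona, correct in results:
        group = persona[group_key]
        totals[group][0] += int(correct)
        totals[group][1] += 1
    return {g: c / n for g, (c, n) in totals.items()}

# Illustrative personas with made-up survey-style attributes.
personas = [
    {"age": "34", "region": "EU", "values": "secular-rational"},
    {"age": "61", "region": "US", "values": "traditional"},
]
prompt = persona_prompt(personas[0], "Is the following claim true? ...")
# Mock per-persona outcomes from some factuality task:
results = [(personas[0], True), (personas[1], False)]
print(subgroup_scores(results, "region"))
```

Here the aggregate accuracy is 50%, but splitting by `region` reveals one subgroup at 1.0 and another at 0.0, which is the kind of hidden disparity the benchmark is designed to surface.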
Anthology ID:
2025.findings-emnlp.640
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2025
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
11982–11992
URL:
https://preview.aclanthology.org/author-page-yu-wang-polytechnic/2025.findings-emnlp.640/
DOI:
10.18653/v1/2025.findings-emnlp.640
Cite (ACL):
Matthieu Tehenan, Eric Chamoun, and Andreas Vlachos. 2025. MPTA: MultiTask Personalization Assessment. In Findings of the Association for Computational Linguistics: EMNLP 2025, pages 11982–11992, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
MPTA: MultiTask Personalization Assessment (Tehenan et al., Findings 2025)
PDF:
https://preview.aclanthology.org/author-page-yu-wang-polytechnic/2025.findings-emnlp.640.pdf
Checklist:
2025.findings-emnlp.640.checklist.pdf