PersonalityChat: Conversation Distillation for Personalized Dialog Modeling with Facts and Traits
Ehsan Lotfi, Maxime De Bruyn, Jeska Buhmann, Walter Daelemans
Abstract
The new wave of Large Language Models (LLMs) offers an efficient tool for curating sizeable conversational datasets. So far, studies have mainly focused on task-oriented or generic open-domain dialogs, and have not fully explored the ability of LLMs to follow complicated prompts. In this work, we focus on personalization and employ LLMs to curate a dataset that is difficult and costly to crowd-source: PersonalityChat is a synthetic conversational dataset based on the popular PersonaChat dataset, but conditioned on both personas and (Big-5) personality traits. Evaluating models fine-tuned on this dataset, we show that the personality trait labels can be used for trait-based personalization of generative dialogue models. We also perform a head-to-head comparison between PersonalityChat and PersonaChat, and show that training on the distilled dataset results in more fluent and coherent dialog agents in the small-model regime.
- Anthology ID:
- 2023.gem-1.29
- Volume:
- Proceedings of the Third Workshop on Natural Language Generation, Evaluation, and Metrics (GEM)
- Month:
- December
- Year:
- 2023
- Address:
- Singapore
- Editors:
- Sebastian Gehrmann, Alex Wang, João Sedoc, Elizabeth Clark, Kaustubh Dhole, Khyathi Raghavi Chandu, Enrico Santus, Hooman Sedghamiz
- Venues:
- GEM | WS
- Publisher:
- Association for Computational Linguistics
- Pages:
- 353–371
- URL:
- https://aclanthology.org/2023.gem-1.29
- Cite (ACL):
- Ehsan Lotfi, Maxime De Bruyn, Jeska Buhmann, and Walter Daelemans. 2023. PersonalityChat: Conversation Distillation for Personalized Dialog Modeling with Facts and Traits. In Proceedings of the Third Workshop on Natural Language Generation, Evaluation, and Metrics (GEM), pages 353–371, Singapore. Association for Computational Linguistics.
- Cite (Informal):
- PersonalityChat: Conversation Distillation for Personalized Dialog Modeling with Facts and Traits (Lotfi et al., GEM-WS 2023)
- PDF:
- https://preview.aclanthology.org/ingest-2024-clasp/2023.gem-1.29.pdf