Abstract
This study explores the consistency of personality traits in quantized large language models (LLMs) for edge-device role-playing scenarios. Using the Big Five personality model, we evaluate how stable assigned personalities remain for Quantized Role-Playing Dialog Agents (QRPDA) across multi-turn interactions. We assess multiple LLMs at various quantization levels, combining binary indexing of personality traits, explicit self-assessments, and linguistic analysis of generated narratives. To address personality inconsistency, we propose a non-parametric method called Think2. Our multi-faceted evaluation framework demonstrates Think2’s effectiveness in maintaining consistent personality traits in QRPDA. Moreover, we offer insights to help select the optimal model for QRPDA, improving its stability and reliability in real-world applications.
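To make the "binary indexing of personality traits" mentioned in the abstract concrete, the sketch below enumerates all 2^5 = 32 binary Big Five (OCEAN) profiles and renders each as a role-play prompt. This is a minimal illustrative sketch, not the authors' implementation: the helper names and the prompt wording are hypothetical, and the paper's exact encoding and prompts may differ.

```python
from itertools import product

# Big Five (OCEAN) dimensions, each binarized as 1 = high, 0 = low.
TRAITS = ["Openness", "Conscientiousness", "Extraversion",
          "Agreeableness", "Neuroticism"]

def index_to_profile(bits):
    """Map a 5-bit tuple, e.g. (1, 0, 1, 1, 0), to high/low trait levels."""
    return {trait: ("high" if b else "low") for trait, b in zip(TRAITS, bits)}

def profile_to_prompt(profile):
    """Render a profile as a role-play system prompt (hypothetical wording)."""
    traits = ", ".join(f"{level} {trait}" for trait, level in profile.items())
    return f"You are a character with the following personality: {traits}."

if __name__ == "__main__":
    # Enumerate all 32 binary personality indices, e.g. "10110".
    for bits in product([0, 1], repeat=5):
        index = "".join(map(str, bits))
        print(index, "->", profile_to_prompt(index_to_profile(bits)))
```

Indexing profiles this way makes it easy to check, after each dialogue turn, whether a model's self-assessed traits still map back to the binary index it was assigned.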
- Anthology ID: 2024.emnlp-industry.19
- Volume: Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing: Industry Track
- Month: November
- Year: 2024
- Address: Miami, Florida, US
- Editors: Franck Dernoncourt, Daniel Preoţiuc-Pietro, Anastasia Shimorina
- Venue: EMNLP
- Publisher: Association for Computational Linguistics
- Pages: 239–255
- URL: https://preview.aclanthology.org/remove-affiliations/2024.emnlp-industry.19/
- DOI: 10.18653/v1/2024.emnlp-industry.19
- Cite (ACL): Yixiao Wang, Homa Fashandi, and Kevin Ferreira. 2024. Investigating the Personality Consistency in Quantized Role-Playing Dialogue Agents. In Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing: Industry Track, pages 239–255, Miami, Florida, US. Association for Computational Linguistics.
- Cite (Informal): Investigating the Personality Consistency in Quantized Role-Playing Dialogue Agents (Wang et al., EMNLP 2024)
- PDF: https://preview.aclanthology.org/remove-affiliations/2024.emnlp-industry.19.pdf