Am I Me or You? State-of-the-Art Dialogue Models Cannot Maintain an Identity

Kurt Shuster, Jack Urbanek, Arthur Szlam, Jason Weston


Abstract
State-of-the-art dialogue models still often stumble with regard to factual accuracy and self-contradiction. Anecdotally, they have been observed to fail to maintain character identity throughout discourse; more specifically, they may take on the role of their interlocutor. In this work we formalize and quantify this deficiency, and show experimentally through human evaluations that it is indeed a problem. In contrast, we show that discriminative models trained specifically to recognize who is speaking can perform well; further, these can be used as automated metrics. Finally, we evaluate a wide variety of mitigation methods, including changes to model architecture, training protocol, and decoding strategy. Our best models reduce mistaken-identity issues by nearly 65% according to human annotators, while simultaneously improving engagingness. Despite these results, we find that maintaining character identity remains a challenging problem.
Anthology ID: 2022.findings-naacl.182
Volume: Findings of the Association for Computational Linguistics: NAACL 2022
Month: July
Year: 2022
Address: Seattle, United States
Venue: Findings
Publisher: Association for Computational Linguistics
Pages: 2367–2387
URL: https://aclanthology.org/2022.findings-naacl.182
DOI: 10.18653/v1/2022.findings-naacl.182
Cite (ACL):
Kurt Shuster, Jack Urbanek, Arthur Szlam, and Jason Weston. 2022. Am I Me or You? State-of-the-Art Dialogue Models Cannot Maintain an Identity. In Findings of the Association for Computational Linguistics: NAACL 2022, pages 2367–2387, Seattle, United States. Association for Computational Linguistics.
Cite (Informal):
Am I Me or You? State-of-the-Art Dialogue Models Cannot Maintain an Identity (Shuster et al., Findings 2022)
PDF: https://preview.aclanthology.org/remove-xml-comments/2022.findings-naacl.182.pdf
Video: https://preview.aclanthology.org/remove-xml-comments/2022.findings-naacl.182.mp4