Abstract
Emotions are an integral part of human cognition: they guide not only our understanding of the world but also our actions within it. As such, whether we soothe or inflame an emotion is not inconsequential. Recent work in conversational AI has focused on responding empathetically to users, validating and soothing their emotions without a real basis. This AI-aided emotional regulation can have negative consequences for users and society, tending towards a one-note happiness defined only as the absence of “negative” emotions. We argue that we must carefully consider whether and how to respond to users’ emotions.
- Anthology ID: 2023.findings-acl.515
- Volume: Findings of the Association for Computational Linguistics: ACL 2023
- Month: July
- Year: 2023
- Address: Toronto, Canada
- Editors: Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
- Venue: Findings
- Publisher: Association for Computational Linguistics
- Pages: 8123–8130
- URL: https://aclanthology.org/2023.findings-acl.515
- DOI: 10.18653/v1/2023.findings-acl.515
- Cite (ACL): Alba Cercas Curry and Amanda Cercas Curry. 2023. Computer says “No”: The Case Against Empathetic Conversational AI. In Findings of the Association for Computational Linguistics: ACL 2023, pages 8123–8130, Toronto, Canada. Association for Computational Linguistics.
- Cite (Informal): Computer says “No”: The Case Against Empathetic Conversational AI (Cercas Curry & Cercas Curry, Findings 2023)
- PDF: https://preview.aclanthology.org/proper-vol2-ingestion/2023.findings-acl.515.pdf