Steering Conversational Large Language Models for Long Emotional Support Conversations

Navid Madani, Rohini Srihari


Abstract
In this study, we address the challenge of consistently following emotional support strategies in long conversations with large language models (LLMs). We introduce the Strategy-Relevant Attention (SRA) metric, a model-agnostic measure designed to evaluate how effectively LLMs adhere to strategic prompts in emotional support contexts. By analyzing conversations from the Emotional Support Conversations dataset (ESConv) using LLaMA models, we demonstrate that SRA is significantly correlated with a model's ability to sustain the outlined strategy throughout an interaction. Our findings reveal that SRA-informed prompts lead to enhanced strategic adherence, producing conversations that more reliably exhibit the desired emotional support strategies over longer exchanges. Furthermore, we contribute a comprehensive, multi-branch synthetic conversation dataset for ESConv, featuring a variety of strategy continuations generated with our optimized prompting method. The code and data are publicly available in our GitHub repository.
Anthology ID:
2025.sicon-1.9
Volume:
Proceedings of the Third Workshop on Social Influence in Conversations (SICon 2025)
Month:
July
Year:
2025
Address:
Vienna, Austria
Editors:
James Hale, Brian Deuksin Kwon, Ritam Dutt
Venues:
SICon | WS
Publisher:
Association for Computational Linguistics
Pages:
109–123
URL:
https://preview.aclanthology.org/landing_page/2025.sicon-1.9/
Cite (ACL):
Navid Madani and Rohini Srihari. 2025. Steering Conversational Large Language Models for Long Emotional Support Conversations. In Proceedings of the Third Workshop on Social Influence in Conversations (SICon 2025), pages 109–123, Vienna, Austria. Association for Computational Linguistics.
Cite (Informal):
Steering Conversational Large Language Models for Long Emotional Support Conversations (Madani & Srihari, SICon 2025)
PDF:
https://preview.aclanthology.org/landing_page/2025.sicon-1.9.pdf