SWI: Speaking with Intent in Large Language Models

Yuwei Yin, Eunjeong Hwang, Giuseppe Carenini


Abstract
Intent, typically clearly formulated and planned, functions as a cognitive framework for communication and problem-solving. This paper introduces the concept of Speaking with Intent (SWI) in large language models (LLMs), where the explicitly generated intent encapsulates the model’s underlying intention and provides high-level planning to guide subsequent analysis and action. By emulating deliberate and purposeful thoughts in the human mind, SWI is hypothesized to enhance the reasoning capabilities and generation quality of LLMs. Extensive experiments on text summarization, multi-task question answering, and mathematical reasoning benchmarks consistently demonstrate the effectiveness and generalizability of Speaking with Intent over direct generation without explicit intent. Further analysis corroborates the generalizability of SWI under different experimental settings. Moreover, human evaluations verify the coherence, effectiveness, and interpretability of the intent produced by SWI. The promising results in enhancing LLMs with explicit intents pave a new avenue for boosting LLMs’ generation and reasoning abilities with cognitive notions.
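To make the idea concrete, the sketch below interprets SWI as a prompting scheme in which the model is asked to state an explicit intent (its goal and high-level plan) before producing the answer, contrasted with a direct-generation baseline. This is an illustrative reading of the abstract, not the authors' released implementation: the prompt wording, the "Intent:"/"Answer:" section markers, and the `generate` helper are all assumptions standing in for an arbitrary LLM backend.

```python
# Illustrative sketch of Speaking with Intent (SWI) as a prompting scheme.
# NOTE: not the authors' implementation; the prompt wording, the section
# markers, and the `generate` helper are hypothetical placeholders.

def generate(prompt: str) -> str:
    """Hypothetical wrapper around any chat/completion LLM API."""
    raise NotImplementedError("plug in your preferred LLM client here")

DIRECT_TEMPLATE = (
    "Answer the following question.\n\n"
    "Question: {question}\n"
    "Answer:"
)

SWI_TEMPLATE = (
    "Before answering, first state your intent: what you aim to accomplish "
    "and the high-level plan that will guide your analysis. "
    "Then give the final answer.\n\n"
    "Question: {question}\n"
    "Intent:"
)

def answer_with_intent(question: str) -> dict:
    """SWI-style generation: elicit an explicit intent, then the answer."""
    completion = generate(SWI_TEMPLATE.format(question=question))
    intent, _, answer = completion.partition("Answer:")
    return {"intent": intent.strip(), "answer": answer.strip()}

def answer_directly(question: str) -> str:
    """Baseline: direct generation without an explicit intent."""
    return generate(DIRECT_TEMPLATE.format(question=question)).strip()
```

Under this reading, the intent text plays the role the abstract describes: a high-level plan produced up front that conditions the subsequent answer, which the paper evaluates against direct generation on summarization, multi-task QA, and mathematical reasoning benchmarks.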
Anthology ID: 2025.inlg-main.39
Volume: Proceedings of the 18th International Natural Language Generation Conference
Month: October
Year: 2025
Address: Hanoi, Vietnam
Editors: Lucie Flek, Shashi Narayan, Lê Hồng Phương, Jiahuan Pei
Venue: INLG
SIG: SIGGEN
Publisher: Association for Computational Linguistics
Pages: 684–698
URL: https://preview.aclanthology.org/ingest-luhme/2025.inlg-main.39/
Cite (ACL): Yuwei Yin, Eunjeong Hwang, and Giuseppe Carenini. 2025. SWI: Speaking with Intent in Large Language Models. In Proceedings of the 18th International Natural Language Generation Conference, pages 684–698, Hanoi, Vietnam. Association for Computational Linguistics.
Cite (Informal): SWI: Speaking with Intent in Large Language Models (Yin et al., INLG 2025)
PDF: https://preview.aclanthology.org/ingest-luhme/2025.inlg-main.39.pdf