DEMO: Reframing Dialogue Interaction with Fine-grained Element Modeling
Minzheng Wang | Xinghua Zhang | Kun Chen | Nan Xu | Haiyang Yu | Fei Huang | Wenji Mao | Yongbin Li
Findings of the Association for Computational Linguistics: ACL 2025
Large language model (LLM)-enabled dialogue systems have become one of the central modes of human-machine interaction, bringing about vast amounts of conversation logs and increasing demand for dialogue generation. A dialogue's life-cycle spans from Prelude through Interlocution to Epilogue, encompassing rich dialogue elements. Despite the large volume of dialogue-related studies, there is a lack of systematic investigation into the dialogue stages that would frame benchmark construction covering comprehensive dialogue elements. This hinders the precise modeling, generation, and assessment of LLM-based dialogue systems. To bridge this gap, we introduce a new research task, Dialogue Element MOdeling, which includes Element Awareness and Dialogue Agent Interaction, and propose a novel benchmark, DEMO, designed for comprehensive dialogue modeling and assessment. On this basis, we further build the DEMO agent, which adeptly models dialogue elements via imitation learning. Extensive experiments on DEMO indicate that current representative LLMs still have considerable room for improvement, and that our DEMO agent performs well in both dialogue element modeling and out-of-domain tasks.