MTGP: Multi-turn Target-oriented Dialogue Guided by Generative Global Path with Flexible Turns

Anqi Liu, Bo Wang, Yue Tan, Dongming Zhao, Kun Huang, Ruifang He, Yuexian Hou


Abstract
Target-oriented dialogue steers a conversation toward a given target quickly and smoothly. The latest approaches focus on global planning, which plans toward the target before the conversation rather than adopting a greedy strategy during it. However, the global plan in existing work is fixed to a certain number of turns because paths are generated with a fixed number of nodes, which limits the optimization of both turn count and coherence in the target-oriented process. Toward flexible global planning, we propose generating the global path as a natural-language sentence instead of a sequence of nodes. Guided by this path, the dialogue reaches the target in a flexible number of turns. For model training, we also extract target-oriented dialogues from a chit-chat corpus using a knowledge graph. We conduct experiments on three datasets and simulate scenarios with and without user participation. The results show that our method needs fewer turns, produces more coherent semantics, and achieves a higher success rate in reaching the target than the baselines.
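
To make the contrast described in the abstract concrete, below is a minimal, illustrative sketch (not the authors' code) of the difference between a fixed node-sequence plan and a path generated as a single natural-language sentence; the model name, prompt format, and topic strings are placeholder assumptions.

# Illustrative sketch only: contrasts a fixed node-sequence plan with a
# natural-language path generated in one pass. Model and prompt are assumed.
from transformers import pipeline

# Conventional global planning: a fixed sequence of knowledge-graph nodes,
# which effectively pins the dialogue to one turn per node.
node_path = ["movies", "science fiction", "Interstellar"]

# Path-as-sentence planning (the idea described in the abstract): generate the
# whole plan as free-form text, so the turns that realize it can vary.
generator = pipeline("text-generation", model="gpt2")  # placeholder model
prompt = "Start topic: movies. Target: Interstellar. Plan:"
natural_language_path = generator(prompt, max_new_tokens=40)[0]["generated_text"]
print(natural_language_path)

Because the plan is a sentence rather than a node list, a dialogue system can realize it over as many or as few turns as the conversation allows, which is the flexibility the paper targets.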
Anthology ID:
2023.findings-acl.18
Volume:
Findings of the Association for Computational Linguistics: ACL 2023
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
259–271
URL:
https://aclanthology.org/2023.findings-acl.18
DOI:
10.18653/v1/2023.findings-acl.18
Cite (ACL):
Anqi Liu, Bo Wang, Yue Tan, Dongming Zhao, Kun Huang, Ruifang He, and Yuexian Hou. 2023. MTGP: Multi-turn Target-oriented Dialogue Guided by Generative Global Path with Flexible Turns. In Findings of the Association for Computational Linguistics: ACL 2023, pages 259–271, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
MTGP: Multi-turn Target-oriented Dialogue Guided by Generative Global Path with Flexible Turns (Liu et al., Findings 2023)
PDF:
https://aclanthology.org/2023.findings-acl.18.pdf