DivTOD: Unleashing the Power of LLMs for Diversifying Task-Oriented Dialogue Representations
Weihao Zeng, Dayuan Fu, Keqing He, Yejie Wang, Yukai Xu, Weiran Xu
Abstract
Language models pre-trained on general text have achieved impressive results in diverse fields. Yet, the distinct linguistic characteristics of task-oriented dialogues (TOD) compared to general text limit the practical utility of existing language models. Current task-oriented dialogue pre-training methods overlook the one-to-many property of conversations, where multiple responses can be appropriate given the same conversation context. In this paper, we propose a novel dialogue pre-training model called DivTOD, which collaborates with LLMs to learn diverse task-oriented dialogue representations. DivTOD guides LLMs in transferring diverse knowledge to smaller models while removing domain knowledge that contradicts task-oriented dialogues. Experiments show that our model outperforms strong TOD baselines on various downstream dialogue tasks and learns the intrinsic diversity of task-oriented dialogues.
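The abstract describes the approach only at a high level. As a rough illustration of the one-to-many property and the knowledge-filtering idea it mentions, the sketch below prompts an LLM for several alternative system responses to the same dialogue context and then drops candidates that clash with the target domain. This is a minimal sketch under our own assumptions, not the DivTOD pipeline: the `query_llm` stub, the prompt wording, and the toy filtering rule are all hypothetical.

```python
# Illustrative sketch only -- not the DivTOD implementation.
# `query_llm`, the prompt, and the filter are assumptions for demonstration.

def query_llm(prompt: str) -> str:
    # Placeholder: swap in a real LLM API call. Canned text is returned here
    # so the sketch runs end to end without external dependencies.
    return (
        "We have several budget hotels in the centre; shall I book one?\n"
        "Sure, do you have a preferred star rating for the hotel?\n"
        "I found 3 cheap hotels downtown. Would you like their phone numbers?"
    )


def propose_diverse_responses(context: str, n_candidates: int = 3) -> list[str]:
    """Ask an LLM for several alternative system responses to one dialogue
    context, illustrating the one-to-many property of TOD."""
    prompt = (
        f"Given the task-oriented dialogue context below, write {n_candidates} "
        "different but equally appropriate system responses, one per line.\n\n"
        f"Context:\n{context}\n"
    )
    completion = query_llm(prompt)
    return [line.strip() for line in completion.splitlines() if line.strip()]


def filter_contradictions(candidates: list[str], disallowed_terms: set[str]) -> list[str]:
    """Toy filter: drop candidates mentioning terms that contradict the
    dialogue's domain (a stand-in for removing conflicting domain knowledge)."""
    return [
        c for c in candidates
        if not any(term.lower() in c.lower() for term in disallowed_terms)
    ]


if __name__ == "__main__":
    context = "User: I need a cheap hotel in the city centre for two nights."
    candidates = propose_diverse_responses(context)
    kept = filter_contradictions(candidates, disallowed_terms={"flight", "restaurant"})
    # The surviving (context, response) pairs could then serve as additional
    # pre-training targets for a smaller dialogue model.
    print(kept)
```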
- Anthology ID: 2024.findings-naacl.51
- Volume: Findings of the Association for Computational Linguistics: NAACL 2024
- Month: June
- Year: 2024
- Address: Mexico City, Mexico
- Editors: Kevin Duh, Helena Gomez, Steven Bethard
- Venue: Findings
- Publisher: Association for Computational Linguistics
- Pages: 800–813
- URL: https://aclanthology.org/2024.findings-naacl.51
- DOI: 10.18653/v1/2024.findings-naacl.51
- Cite (ACL): Weihao Zeng, Dayuan Fu, Keqing He, Yejie Wang, Yukai Xu, and Weiran Xu. 2024. DivTOD: Unleashing the Power of LLMs for Diversifying Task-Oriented Dialogue Representations. In Findings of the Association for Computational Linguistics: NAACL 2024, pages 800–813, Mexico City, Mexico. Association for Computational Linguistics.
- Cite (Informal): DivTOD: Unleashing the Power of LLMs for Diversifying Task-Oriented Dialogue Representations (Zeng et al., Findings 2024)
- PDF: https://preview.aclanthology.org/nschneid-patch-5/2024.findings-naacl.51.pdf