GCDST: A Graph-based and Copy-augmented Multi-domain Dialogue State Tracking

Peng Wu, Bowei Zou, Ridong Jiang, AiTi Aw


Abstract
As an essential component of task-oriented dialogue systems, Dialogue State Tracking (DST) estimates user intentions and requests in dialogue contexts and extracts substantial goals (states) from user utterances to help downstream modules determine the next actions of the dialogue system. In practical use, a major challenge in constructing a robust DST model is processing conversations with multi-domain states. However, most existing approaches train DST on each domain independently, ignoring information shared across domains. To tackle the multi-domain DST task, we first construct a dialogue state graph to transfer structured features among related domain-slot pairs across domains. Then, we encode the graph information of dialogue states with graph convolutional networks and utilize a hard copy mechanism to directly copy historical states from the previous conversation. Experimental results show that our model improves on the multi-domain DST baseline (TRADE) by absolute joint accuracy gains of 2.0% and 1.0% on the MultiWOZ 2.0 and 2.1 dialogue datasets, respectively.
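The abstract describes encoding a dialogue state graph, whose nodes are domain-slot pairs, with graph convolutional networks. The sketch below is not the paper's implementation; it is a minimal, hypothetical illustration of one standard GCN propagation step (Kipf & Welling-style symmetric normalization) over a toy graph of three domain-slot nodes, with made-up adjacency and random features.

```python
import numpy as np

def gcn_layer(A, H, W):
    """One graph-convolution step: propagate node features H through
    adjacency A with weights W, using the symmetric normalization
    D^{-1/2} (A + I) D^{-1/2}, followed by a ReLU."""
    A_hat = A + np.eye(A.shape[0])           # add self-loops
    d = A_hat.sum(axis=1)                    # node degrees
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))   # degree normalization
    return np.maximum(D_inv_sqrt @ A_hat @ D_inv_sqrt @ H @ W, 0.0)

# Toy dialogue state graph: 3 domain-slot nodes, e.g.
#   0: hotel-area, 1: restaurant-area, 2: restaurant-food
# Edges link pairs sharing a slot (0-1) or a domain (1-2);
# the topology here is illustrative, not taken from the paper.
A = np.array([[0., 1., 0.],
              [1., 0., 1.],
              [0., 1., 0.]])
H = np.random.default_rng(0).normal(size=(3, 4))  # toy node embeddings
W = np.random.default_rng(1).normal(size=(4, 4))  # learnable weights
H_next = gcn_layer(A, H, W)  # updated node features, shape (3, 4)
```

After one such step, each domain-slot node's representation mixes in features from its neighbors, which is the structured-feature transfer across domains the abstract refers to; stacking layers widens the receptive field.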
Anthology ID:
2020.findings-emnlp.95
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2020
Month:
November
Year:
2020
Address:
Online
Editors:
Trevor Cohn, Yulan He, Yang Liu
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
1063–1073
URL:
https://aclanthology.org/2020.findings-emnlp.95
DOI:
10.18653/v1/2020.findings-emnlp.95
Cite (ACL):
Peng Wu, Bowei Zou, Ridong Jiang, and AiTi Aw. 2020. GCDST: A Graph-based and Copy-augmented Multi-domain Dialogue State Tracking. In Findings of the Association for Computational Linguistics: EMNLP 2020, pages 1063–1073, Online. Association for Computational Linguistics.
Cite (Informal):
GCDST: A Graph-based and Copy-augmented Multi-domain Dialogue State Tracking (Wu et al., Findings 2020)
PDF:
https://preview.aclanthology.org/naacl24-info/2020.findings-emnlp.95.pdf