Multi-Domain Dialogue State Tracking with Disentangled Domain-Slot Attention

Longfei Yang, Jiyi Li, Sheng Li, Takahiro Shinozaki


Abstract
As the core of task-oriented dialogue systems, dialogue state tracking (DST) is designed to track the dialogue state throughout the conversation between users and systems. Multi-domain DST has been an important challenge, in which the dialogue states across multiple domains need to be considered. In recent mainstream approaches, each domain and slot are aggregated and regarded as a single query, which is fed into attention over the dialogue history to obtain domain-slot specific representations. In this work, we propose disentangled domain-slot attention for multi-domain dialogue state tracking. The proposed approach disentangles the domain-slot specific information extraction in a flexible and context-dependent manner by separating the queries about domains and slots in the attention component. Through a series of experiments on the MultiWOZ 2.0 and MultiWOZ 2.4 datasets, we demonstrate that our proposed approach outperforms the standard multi-head attention with an aggregated domain-slot query.
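To make the contrast in the abstract concrete, the sketch below illustrates the baseline (a single aggregated domain-slot query attending over the dialogue history) next to one plausible disentangled variant (separate domain and slot queries whose context vectors are combined). This is a hypothetical illustration only; the paper's exact formulation, score function, and combination operator may differ.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over attention scores.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def aggregated_attention(query, keys, values):
    # Baseline: one query vector per (domain, slot) pair attends to the
    # dialogue-history keys; returns a single context vector.
    scores = keys @ query / np.sqrt(keys.shape[-1])  # (T,)
    weights = softmax(scores)                        # (T,)
    return weights @ values                          # (d,)

def disentangled_attention(q_domain, q_slot, keys, values):
    # Hypothetical disentangled variant: domain and slot queries attend to the
    # history separately, and their context vectors are summed, so domain- and
    # slot-specific information is extracted independently. (The combination
    # operator here is an assumption for illustration.)
    ctx_domain = aggregated_attention(q_domain, keys, values)
    ctx_slot = aggregated_attention(q_slot, keys, values)
    return ctx_domain + ctx_slot

# Toy usage: T=10 history tokens, hidden size d=8.
rng = np.random.default_rng(0)
keys = rng.normal(size=(10, 8))
values = rng.normal(size=(10, 8))
q_domain = rng.normal(size=8)   # e.g. embedding of "hotel"
q_slot = rng.normal(size=8)     # e.g. embedding of "price range"
context = disentangled_attention(q_domain, q_slot, keys, values)
```

In the aggregated baseline, `q_domain` and `q_slot` would instead be merged into one query vector before attention, which ties the two kinds of information to a single attention distribution.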
Anthology ID:
2023.findings-acl.304
Volume:
Findings of the Association for Computational Linguistics: ACL 2023
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
4928–4938
URL:
https://aclanthology.org/2023.findings-acl.304
DOI:
10.18653/v1/2023.findings-acl.304
Cite (ACL):
Longfei Yang, Jiyi Li, Sheng Li, and Takahiro Shinozaki. 2023. Multi-Domain Dialogue State Tracking with Disentangled Domain-Slot Attention. In Findings of the Association for Computational Linguistics: ACL 2023, pages 4928–4938, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Multi-Domain Dialogue State Tracking with Disentangled Domain-Slot Attention (Yang et al., Findings 2023)
PDF:
https://preview.aclanthology.org/emnlp-22-attachments/2023.findings-acl.304.pdf