Dynamic Schema Graph Fusion Network for Multi-Domain Dialogue State Tracking

Yue Feng, Aldo Lipani, Fanghua Ye, Qiang Zhang, Emine Yilmaz


Abstract
Dialogue State Tracking (DST) aims to keep track of users’ intentions during the course of a conversation. In DST, modelling the relations among domains and slots is still an under-studied problem. Existing approaches that have considered such relations generally fall short in: (1) fusing prior slot-domain membership relations and dialogue-aware dynamic slot relations explicitly, and (2) generalizing to unseen domains. To address these issues, we propose a novel Dynamic Schema Graph Fusion Network (DSGFNet), which generates a dynamic schema graph to explicitly fuse the prior slot-domain membership relations and dialogue-aware dynamic slot relations. It also uses the schemata to facilitate knowledge transfer to new domains. DSGFNet consists of a dialogue utterance encoder, a schema graph encoder, a dialogue-aware schema graph evolving network, and a schema graph enhanced dialogue state decoder. Empirical results on benchmark datasets (i.e., SGD, MultiWOZ2.1, and MultiWOZ2.2) show that DSGFNet outperforms existing methods.
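To make the four-module pipeline named in the abstract concrete, the following is a minimal, hypothetical PyTorch sketch of how such components could be wired together: a dialogue utterance encoder, a schema graph encoder over prior slot-domain membership relations, a dialogue-aware graph evolving step, and a schema-graph-enhanced state decoder. All class and argument names, module internals, and dimensions are illustrative assumptions and are not taken from the authors' implementation.

```python
# Hypothetical sketch of the four modules described in the abstract.
# Every design choice below (GRU encoder, attention-based evolving step,
# linear decoder) is an assumption for illustration only.
import torch
import torch.nn as nn


class DSGFNetSketch(nn.Module):
    """Skeleton: utterance encoder -> schema graph encoder ->
    dialogue-aware graph evolving -> schema-graph-enhanced state decoder."""

    def __init__(self, hidden: int = 256, num_values: int = 100):
        super().__init__()
        # (1) Dialogue utterance encoder (a pretrained LM in practice;
        #     a GRU here to keep the sketch self-contained).
        self.utt_encoder = nn.GRU(hidden, hidden, batch_first=True)
        # (2) Schema graph encoder over prior slot-domain membership relations.
        self.graph_encoder = nn.Linear(hidden, hidden)
        # (3) Dialogue-aware schema graph evolving network: attends from schema
        #     nodes to the dialogue context to capture dynamic slot relations.
        self.evolve = nn.MultiheadAttention(hidden, num_heads=4, batch_first=True)
        # (4) Schema graph enhanced dialogue state decoder: predicts a value
        #     distribution for every slot.
        self.decoder = nn.Linear(2 * hidden, num_values)

    def forward(self, utt_emb: torch.Tensor, slot_emb: torch.Tensor,
                membership_adj: torch.Tensor) -> torch.Tensor:
        # utt_emb:        [batch, turn_len, hidden]   dialogue token embeddings
        # slot_emb:       [batch, num_slots, hidden]  schema (slot) embeddings
        # membership_adj: [num_slots, num_slots]      prior slot-domain membership graph
        _, ctx = self.utt_encoder(utt_emb)              # ctx: [1, batch, hidden]
        ctx = ctx.transpose(0, 1)                       # [batch, 1, hidden]

        # Propagate prior membership relations over the schema graph.
        prior_nodes = torch.matmul(membership_adj, self.graph_encoder(slot_emb))

        # Evolve the graph with dialogue-aware dynamic slot relations.
        dyn_nodes, _ = self.evolve(prior_nodes, utt_emb, utt_emb)

        # Fuse prior and dynamic views with the dialogue context, then decode
        # a value distribution per slot.
        fused = torch.cat(
            [prior_nodes + dyn_nodes,
             ctx.expand(-1, prior_nodes.size(1), -1)], dim=-1)
        return self.decoder(fused)                      # [batch, num_slots, num_values]
```

Because the slot representations come from the schema rather than from a fixed ontology index, a pipeline shaped like this can, in principle, score slots from domains unseen at training time, which is the generalization property the abstract highlights.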
Anthology ID: 2022.acl-long.10
Volume: Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month: May
Year: 2022
Address: Dublin, Ireland
Venue: ACL
Publisher: Association for Computational Linguistics
Pages: 115–126
URL: https://aclanthology.org/2022.acl-long.10
DOI: 10.18653/v1/2022.acl-long.10
Cite (ACL): Yue Feng, Aldo Lipani, Fanghua Ye, Qiang Zhang, and Emine Yilmaz. 2022. Dynamic Schema Graph Fusion Network for Multi-Domain Dialogue State Tracking. In Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 115–126, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal): Dynamic Schema Graph Fusion Network for Multi-Domain Dialogue State Tracking (Feng et al., ACL 2022)
PDF: https://preview.aclanthology.org/ingestion-script-update/2022.acl-long.10.pdf
Video: https://preview.aclanthology.org/ingestion-script-update/2022.acl-long.10.mp4
Data: SGD