Domain Adaptive Dialog Generation via Meta Learning

Kun Qian, Zhou Yu


Abstract
Domain adaptation is an essential task in dialog system building because many new dialog tasks are created for different needs every day. Collecting and annotating training data for these new tasks is costly since it involves real user interactions. We propose a domain adaptive dialog generation method based on meta-learning (DAML). DAML is an end-to-end trainable dialog system model that learns from multiple rich-resource tasks and then adapts to new domains with minimal training samples. We train a dialog system model on multiple rich-resource single-domain dialog datasets by applying the model-agnostic meta-learning algorithm to the dialog domain. The model is capable of learning a competitive dialog system for a new domain with only a few training examples in an efficient manner. The two-step gradient updates in DAML enable the model to learn general features across multiple tasks. We evaluate our method on a simulated dialog dataset and achieve state-of-the-art performance, which is generalizable to new tasks.
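The two-step gradient update mentioned in the abstract follows the MAML pattern: the model is first adapted to each source domain on a small support batch, and the meta-parameters are then updated from the loss of the adapted model on a held-out query batch. The sketch below illustrates this with a first-order approximation in PyTorch; the model class, task batches, and learning rates are illustrative assumptions and not the authors' implementation (their released code is in qbetterk/DAML).

```python
# Minimal first-order MAML-style sketch of the two-step gradient update.
# SimpleDialogModel, the toy tasks, and the learning rates are hypothetical
# placeholders, not the DAML code.
import copy
import torch
import torch.nn as nn

class SimpleDialogModel(nn.Module):
    """Stand-in for the dialog generation model; a single linear layer here."""
    def __init__(self, dim=16):
        super().__init__()
        self.layer = nn.Linear(dim, dim)

    def forward(self, x):
        return self.layer(x)

def meta_train_step(model, tasks, inner_lr=0.01, meta_lr=0.001):
    """One meta-update over several source domains (first-order approximation).

    Each task provides (support_x, support_y, query_x, query_y); the inner step
    adapts a copy of the model on the support batch, and the query-loss
    gradients of the adapted copy are accumulated into the original model.
    """
    loss_fn = nn.MSELoss()
    meta_opt = torch.optim.Adam(model.parameters(), lr=meta_lr)
    meta_opt.zero_grad()

    for support_x, support_y, query_x, query_y in tasks:
        # Step 1: adapt a copy of the model on this domain's support batch.
        adapted = copy.deepcopy(model)
        inner_opt = torch.optim.SGD(adapted.parameters(), lr=inner_lr)
        inner_loss = loss_fn(adapted(support_x), support_y)
        inner_opt.zero_grad()
        inner_loss.backward()
        inner_opt.step()
        inner_opt.zero_grad()  # clear inner-loop gradients before the query pass

        # Step 2: evaluate the adapted copy on the query batch and push its
        # gradients back into the original (meta) parameters.
        query_loss = loss_fn(adapted(query_x), query_y)
        query_loss.backward()
        for p, p_adapted in zip(model.parameters(), adapted.parameters()):
            if p_adapted.grad is not None:
                p.grad = p_adapted.grad if p.grad is None else p.grad + p_adapted.grad

    meta_opt.step()

# Usage: three toy "domains", each with random support/query batches.
torch.manual_seed(0)
model = SimpleDialogModel()
tasks = [(torch.randn(8, 16), torch.randn(8, 16),
          torch.randn(8, 16), torch.randn(8, 16)) for _ in range(3)]
meta_train_step(model, tasks)
```

Adapting to a new target domain then amounts to running only the inner-loop step on that domain's few training examples, which is what lets the meta-trained initialization reach competitive performance with minimal data.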
Anthology ID: P19-1253
Volume: Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics
Month: July
Year: 2019
Address: Florence, Italy
Venue: ACL
Publisher: Association for Computational Linguistics
Pages: 2639–2649
URL: https://aclanthology.org/P19-1253
DOI: 10.18653/v1/P19-1253
Cite (ACL): Kun Qian and Zhou Yu. 2019. Domain Adaptive Dialog Generation via Meta Learning. In Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics, pages 2639–2649, Florence, Italy. Association for Computational Linguistics.
Cite (Informal): Domain Adaptive Dialog Generation via Meta Learning (Qian & Yu, ACL 2019)
PDF: https://preview.aclanthology.org/ingestion-script-update/P19-1253.pdf
Software: P19-1253.Software.zip
Video: https://vimeo.com/384728415
Code: qbetterk/DAML