Few-Shot Dialogue Generation Without Annotated Data: A Transfer Learning Approach

Igor Shalyminov, Sungjin Lee, Arash Eshghi, Oliver Lemon


Abstract
Learning with minimal data is one of the key challenges in the development of practical, production-ready goal-oriented dialogue systems. In a real-world enterprise setting where dialogue systems are developed rapidly and are expected to work robustly for an ever-growing variety of domains, products, and scenarios, efficient learning from a limited number of examples becomes indispensable. In this paper, we introduce a technique to achieve state-of-the-art dialogue generation performance in a few-shot setup, without using any annotated data. We do this by leveraging background knowledge from a larger, more highly represented dialogue source, namely the MetaLWOz dataset. We evaluate our model on the Stanford Multi-Domain Dialogue Dataset, consisting of human-human goal-oriented dialogues in the in-car navigation, appointment scheduling, and weather information domains. We show that our few-shot approach achieves state-of-the-art results on that dataset by consistently outperforming the previous best model in terms of BLEU and Entity F1 scores, while being more data-efficient in that it requires no data annotation.
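The recipe described in the abstract is a two-stage transfer: pre-train a response generator on a large background dialogue corpus, then fine-tune it on only a few target-domain dialogues. The sketch below illustrates that generic recipe only; the tiny GRU seq2seq model, toy vocabulary, and random batches are hypothetical stand-ins and are not the architecture or data pipeline used in the paper.

```python
# Minimal sketch of a pre-train / few-shot fine-tune recipe for response
# generation. Everything here (model, vocabulary, data) is a toy stand-in.
import torch
import torch.nn as nn

PAD = 0           # assumed padding token id
VOCAB_SIZE = 1000  # assumed toy vocabulary size


class Seq2Seq(nn.Module):
    """Tiny GRU encoder-decoder standing in for a response generator."""
    def __init__(self, vocab=VOCAB_SIZE, emb=64, hid=128):
        super().__init__()
        self.embed = nn.Embedding(vocab, emb, padding_idx=PAD)
        self.encoder = nn.GRU(emb, hid, batch_first=True)
        self.decoder = nn.GRU(emb, hid, batch_first=True)
        self.out = nn.Linear(hid, vocab)

    def forward(self, src, tgt_in):
        _, h = self.encoder(self.embed(src))              # encode context
        dec_out, _ = self.decoder(self.embed(tgt_in), h)  # teacher forcing
        return self.out(dec_out)                          # vocab logits


def run_epochs(model, batches, epochs, lr):
    """Standard cross-entropy training loop, shared by both stages."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.CrossEntropyLoss(ignore_index=PAD)
    for _ in range(epochs):
        for src, tgt in batches:
            tgt_in, tgt_out = tgt[:, :-1], tgt[:, 1:]
            logits = model(src, tgt_in)
            loss = loss_fn(logits.reshape(-1, logits.size(-1)),
                           tgt_out.reshape(-1))
            opt.zero_grad()
            loss.backward()
            opt.step()


def fake_batches(n, max_len=10):
    """Random token batches standing in for (context, response) pairs."""
    return [(torch.randint(3, VOCAB_SIZE, (4, max_len)),
             torch.randint(3, VOCAB_SIZE, (4, max_len))) for _ in range(n)]


model = Seq2Seq()
# Stage 1: pre-train on a large background corpus (MetaLWOz-style dialogues).
run_epochs(model, fake_batches(50), epochs=1, lr=1e-3)
# Stage 2: few-shot fine-tune on a handful of target-domain dialogues
# (e.g. a few SMD dialogues), typically with a smaller learning rate.
run_epochs(model, fake_batches(3), epochs=5, lr=1e-4)
```

The same weights are reused across both stages; only the data and the learning rate change between pre-training and the few-shot fine-tuning pass.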
Anthology ID:
W19-5904
Volume:
Proceedings of the 20th Annual SIGdial Meeting on Discourse and Dialogue
Month:
September
Year:
2019
Address:
Stockholm, Sweden
Editors:
Satoshi Nakamura, Milica Gasic, Ingrid Zukerman, Gabriel Skantze, Mikio Nakano, Alexandros Papangelis, Stefan Ultes, Koichiro Yoshino
Venue:
SIGDIAL
SIG:
SIGDIAL
Publisher:
Association for Computational Linguistics
Pages:
32–39
URL:
https://preview.aclanthology.org/build-pipeline-with-new-library/W19-5904/
DOI:
10.18653/v1/W19-5904
Cite (ACL):
Igor Shalyminov, Sungjin Lee, Arash Eshghi, and Oliver Lemon. 2019. Few-Shot Dialogue Generation Without Annotated Data: A Transfer Learning Approach. In Proceedings of the 20th Annual SIGdial Meeting on Discourse and Dialogue, pages 32–39, Stockholm, Sweden. Association for Computational Linguistics.
Cite (Informal):
Few-Shot Dialogue Generation Without Annotated Data: A Transfer Learning Approach (Shalyminov et al., SIGDIAL 2019)
PDF:
https://preview.aclanthology.org/build-pipeline-with-new-library/W19-5904.pdf
Data
MetaLWOz