@inproceedings{shalyminov-etal-2019-shot,
    title = "Few-Shot Dialogue Generation Without Annotated Data: A Transfer Learning Approach",
    author = "Shalyminov, Igor  and
      Lee, Sungjin  and
      Eshghi, Arash  and
      Lemon, Oliver",
    editor = "Nakamura, Satoshi  and
      Gasic, Milica  and
      Zukerman, Ingrid  and
      Skantze, Gabriel  and
      Nakano, Mikio  and
      Papangelis, Alexandros  and
      Ultes, Stefan  and
      Yoshino, Koichiro",
    booktitle = "Proceedings of the 20th Annual SIGdial Meeting on Discourse and Dialogue",
    month = sep,
    year = "2019",
    address = "Stockholm, Sweden",
    publisher = "Association for Computational Linguistics",
    url = "https://preview.aclanthology.org/iwcs-25-ingestion/W19-5904/",
    doi = "10.18653/v1/W19-5904",
    pages = "32--39",
    abstract = "Learning with minimal data is one of the key challenges in the development of practical, production-ready goal-oriented dialogue systems. In a real-world enterprise setting where dialogue systems are developed rapidly and are expected to work robustly for an ever-growing variety of domains, products, and scenarios, efficient learning from a limited number of examples becomes indispensable. In this paper, we introduce a technique to achieve state-of-the-art dialogue generation performance in a few-shot setup, without using any annotated data. We do this by leveraging background knowledge from a larger, more highly represented dialogue source {---} namely, the MetaLWOz dataset. We evaluate our model on the Stanford Multi-Domain Dialogue Dataset, consisting of human-human goal-oriented dialogues in in-car navigation, appointment scheduling, and weather information domains. We show that our few-shot approach achieves state-of-the-art results on that dataset by consistently outperforming the previous best model in terms of BLEU and Entity F1 scores, while being more data-efficient than it by not requiring any data annotation."
}