Getting to Production with Few-shot Natural Language Generation Models

Peyman Heidari, Arash Einolghozati, Shashank Jain, Soumya Batra, Lee Callender, Ankit Arun, Shawn Mei, Sonal Gupta, Pinar Donmez, Vikas Bhardwaj, Anuj Kumar, Michael White


Abstract
In this paper, we study the utilization of pre-trained language models to enable few-shot Natural Language Generation (NLG) in task-oriented dialog systems. We introduce a system consisting of iterative self-training and an extensible mini-template framework that textualizes the structured input data into semi-natural text to fully take advantage of pre-trained language models. We compare various representations of NLG models' input and output and show that transforming the input and output to be similar to what the language model has seen before during pre-training improves the model's few-shot performance substantially. We show that neural models can be trained with as few as 300 annotated examples while providing high fidelity, considerably lowering the resource requirements for standing up a new domain or language. This level of data efficiency removes the need for crowd-sourced data collection, resulting in higher quality data annotated by expert linguists. In addition, model maintenance and debugging processes will improve in this few-shot setting. Finally, we explore distillation and using a caching system to satisfy latency requirements of real-world systems.
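The abstract describes a mini-template framework that "textualizes" structured NLG input into semi-natural text so a pre-trained language model can consume it. A minimal sketch of that idea is below; the dialog act, slot names, and template strings are invented for illustration and are not taken from the paper.

```python
# Hypothetical mini-templates: each slot name maps to a short
# natural-language fragment with a {value} placeholder.
MINI_TEMPLATES = {
    "condition": "the condition is {value}",
    "temp_high": "the high temperature is {value}",
    "temp_low": "the low temperature is {value}",
}

def textualize(dialog_act, slots):
    """Render a structured (dialog_act, slots) input as semi-natural text.

    The output resembles text a pre-trained LM saw during pre-training,
    rather than an opaque slot-value structure.
    """
    parts = [MINI_TEMPLATES[name].format(value=value) for name, value in slots]
    return f"{dialog_act}: " + " ; ".join(parts)

structured_input = ("inform_weather", [("condition", "sunny"), ("temp_high", "75")])
print(textualize(*structured_input))
# → inform_weather: the condition is sunny ; the high temperature is 75
```

In a few-shot setup, strings like this would serve as the model's input, paired with a small number of reference responses; the key design choice sketched here is making the input read like natural text instead of raw structured data.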
Anthology ID:
2021.sigdial-1.8
Volume:
Proceedings of the 22nd Annual Meeting of the Special Interest Group on Discourse and Dialogue
Month:
July
Year:
2021
Address:
Singapore and Online
Venue:
SIGDIAL
SIG:
SIGDIAL
Publisher:
Association for Computational Linguistics
Pages:
66–76
URL:
https://aclanthology.org/2021.sigdial-1.8
Cite (ACL):
Peyman Heidari, Arash Einolghozati, Shashank Jain, Soumya Batra, Lee Callender, Ankit Arun, Shawn Mei, Sonal Gupta, Pinar Donmez, Vikas Bhardwaj, Anuj Kumar, and Michael White. 2021. Getting to Production with Few-shot Natural Language Generation Models. In Proceedings of the 22nd Annual Meeting of the Special Interest Group on Discourse and Dialogue, pages 66–76, Singapore and Online. Association for Computational Linguistics.
Cite (Informal):
Getting to Production with Few-shot Natural Language Generation Models (Heidari et al., SIGDIAL 2021)
PDF:
https://preview.aclanthology.org/ingestion-script-update/2021.sigdial-1.8.pdf
Video:
https://www.youtube.com/watch?v=JKZ_96erOyY