Soloist: Building Task Bots at Scale with Transfer Learning and Machine Teaching
Baolin Peng, Chunyuan Li, Jinchao Li, Shahin Shayandeh, Lars Liden, Jianfeng Gao
Abstract
We present a new method, Soloist, that uses transfer learning and machine teaching to build task bots at scale. We parameterize classical modular task-oriented dialog systems using a Transformer-based auto-regressive language model, which subsumes the different dialog modules into a single neural model. We pre-train, on heterogeneous dialog corpora, a task-grounded response generation model that can generate dialog responses grounded in user goals and real-world knowledge for task completion. The pre-trained model can be efficiently adapted to accomplish new tasks with a handful of task-specific dialogs via machine teaching, where training samples are generated by human teachers interacting with the system. Experiments show that (i) Soloist achieves new state-of-the-art results on well-studied task-oriented dialog benchmarks, including CamRest676 and MultiWOZ; (ii) in the few-shot fine-tuning setting, Soloist significantly outperforms existing methods; and (iii) the use of machine teaching substantially reduces the labeling cost of fine-tuning. The pre-trained models and code are available at https://aka.ms/soloist.
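To make the approach described in the abstract concrete, the sketch below shows how one dialog turn might be flattened into a single sequence (dialog history, belief state, database result, delexicalized response) and used for standard auto-regressive fine-tuning. This is a minimal sketch assuming a GPT-2 backbone via Hugging Face Transformers; the separator tokens, serialization format, and example dialog are illustrative placeholders, not the exact format or data released with Soloist.

```python
# Minimal sketch of Soloist-style sequence serialization and fine-tuning.
# Assumes a GPT-2 backbone via Hugging Face Transformers; the separator
# tokens and the example turn below are illustrative placeholders.
import torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

# Hypothetical separator tokens marking the end of the belief state,
# the database result, and the response.
tokenizer.add_special_tokens(
    {"additional_special_tokens": ["<EOB>", "<EOKB>", "<EOS>"]}
)
model.resize_token_embeddings(len(tokenizer))

def serialize_turn(history: str, belief: str, db_result: str, response: str) -> str:
    """Flatten one dialog turn into a single training sequence:
    dialog history -> belief state -> DB result -> delexicalized response."""
    return (f"{history} => belief: {belief} <EOB> "
            f"db: {db_result} <EOKB> {response} <EOS>")

# One made-up task-specific training example.
sequence = serialize_turn(
    history="user: i need a cheap italian restaurant in the centre",
    belief="restaurant { food = italian ; area = centre ; pricerange = cheap }",
    db_result="restaurant 3 matches",
    response="there are [value_count] cheap italian places in the centre. "
             "would you like to book [value_name]?",
)

# One auto-regressive language-modeling step on the flattened sequence;
# the full training recipe in the paper also adds a contrastive objective
# over corrupted sequences, which is omitted here for brevity.
inputs = tokenizer(sequence, return_tensors="pt")
outputs = model(**inputs, labels=inputs["input_ids"])
outputs.loss.backward()
print(f"LM loss on one example: {outputs.loss.item():.3f}")
```

Collapsing belief tracking, database grounding, and response generation into one sequence is what allows a single pre-trained model to be adapted to a new task from only a handful of task-specific dialogs.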
- Anthology ID: 2021.tacl-1.49
- Volume: Transactions of the Association for Computational Linguistics, Volume 9
- Year: 2021
- Address: Cambridge, MA
- Editors: Brian Roark, Ani Nenkova
- Venue: TACL
- Publisher: MIT Press
- Pages: 807–824
- URL: https://aclanthology.org/2021.tacl-1.49
- DOI: 10.1162/tacl_a_00399
- Cite (ACL): Baolin Peng, Chunyuan Li, Jinchao Li, Shahin Shayandeh, Lars Liden, and Jianfeng Gao. 2021. Soloist: Building Task Bots at Scale with Transfer Learning and Machine Teaching. Transactions of the Association for Computational Linguistics, 9:807–824.
- Cite (Informal): Soloist: Building Task Bots at Scale with Transfer Learning and Machine Teaching (Peng et al., TACL 2021)
- PDF: https://preview.aclanthology.org/nschneid-patch-4/2021.tacl-1.49.pdf