Abstract
Task-oriented dialogue systems help users accomplish tasks such as booking a movie ticket or ordering food through conversation. Generative models parameterized by deep neural networks are widely used for next-turn response generation in such systems. Users naturally want to accomplish multiple tasks within the same conversation, but the ability of generative models to compose multiple tasks is not well studied. In this work, we begin by studying the effect of training on human-human task-oriented dialogues on the ability of Transformer generative models to compose multiple tasks. To that end, we propose and explore two solutions: (1) creating synthetic multi-task dialogue data for training from human-human single-task dialogues, and (2) forcing the encoder representation to be invariant to single- and multi-task dialogues using an auxiliary loss. Our experimental results highlight the difficulty that even a sophisticated variant of the Transformer model faces in learning to compose multiple tasks from single-task dialogues.
- Anthology ID: 2020.insights-1.6
- Volume: Proceedings of the First Workshop on Insights from Negative Results in NLP
- Month: November
- Year: 2020
- Address: Online
- Venue: insights
- Publisher: Association for Computational Linguistics
- Pages: 41–47
- URL: https://aclanthology.org/2020.insights-1.6
- DOI: 10.18653/v1/2020.insights-1.6
- Cite (ACL): Prasanna Parthasarathi, Sharan Narang, and Arvind Neelakantan. 2020. On Task-Level Dialogue Composition of Generative Transformer Model. In Proceedings of the First Workshop on Insights from Negative Results in NLP, pages 41–47, Online. Association for Computational Linguistics.
- Cite (Informal): On Task-Level Dialogue Composition of Generative Transformer Model (Parthasarathi et al., insights 2020)
- PDF: https://preview.aclanthology.org/ingestion-script-update/2020.insights-1.6.pdf
- Code: ppartha03/Dialogue-Compositionality-of-Generative-Transformer
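As a rough illustration of the abstract's two proposed solutions, the sketch below (my own, not taken from the paper or its released code; the function names, the mean-squared formulation of the invariance term, and the weight `alpha` are all assumptions) composes two single-task dialogues into a synthetic multi-task training example and adds an encoder-invariance penalty to the generation loss:

```python
import numpy as np

def compose_dialogues(dialogue_a, dialogue_b):
    """Synthetic multi-task dialogue built by finishing task A and then
    starting task B (naive concatenation of the two turn lists)."""
    return dialogue_a + dialogue_b

def invariance_loss(enc_single, enc_multi):
    """Auxiliary penalty: mean-squared distance between the pooled encoder
    state of a single-task dialogue and that of a multi-task dialogue that
    contains it (formulation assumed for illustration)."""
    diff = np.asarray(enc_multi) - np.asarray(enc_single)
    return float(np.mean(diff ** 2))

def training_loss(generation_loss, enc_single, enc_multi, alpha=0.1):
    """Total objective: next-turn generation loss plus the weighted
    invariance term; alpha is a hypothetical trade-off weight."""
    return generation_loss + alpha * invariance_loss(enc_single, enc_multi)
```

When the two encoder states already match, the auxiliary term vanishes and the objective reduces to the ordinary generation loss; otherwise the penalty pushes the encoder toward representations that do not distinguish single- from multi-task inputs.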