Out-of-Task Training for Dialog State Tracking Models
Michael Heck, Christian Geishauser, Hsien-chin Lin, Nurul Lubis, Marco Moresi, Carel van Niekerk, Milica Gasic
Abstract
Dialog state tracking (DST) suffers from severe data sparsity. While many natural language processing (NLP) tasks benefit from transfer learning and multi-task learning, in dialog these methods are limited by the amount of available data and by the specificity of dialog applications. In this work, we successfully utilize non-dialog data from unrelated NLP tasks to train dialog state trackers. This opens the door to leveraging the abundance of unrelated NLP corpora to mitigate the data sparsity inherent to DST.
- Anthology ID: 2020.coling-main.596
- Volume: Proceedings of the 28th International Conference on Computational Linguistics
- Month: December
- Year: 2020
- Address: Barcelona, Spain (Online)
- Editors: Donia Scott, Nuria Bel, Chengqing Zong
- Venue: COLING
- Publisher: International Committee on Computational Linguistics
- Pages: 6767–6774
- URL: https://aclanthology.org/2020.coling-main.596
- DOI: 10.18653/v1/2020.coling-main.596
- Cite (ACL): Michael Heck, Christian Geishauser, Hsien-chin Lin, Nurul Lubis, Marco Moresi, Carel van Niekerk, and Milica Gasic. 2020. Out-of-Task Training for Dialog State Tracking Models. In Proceedings of the 28th International Conference on Computational Linguistics, pages 6767–6774, Barcelona, Spain (Online). International Committee on Computational Linguistics.
- Cite (Informal): Out-of-Task Training for Dialog State Tracking Models (Heck et al., COLING 2020)
- PDF: https://preview.aclanthology.org/nschneid-patch-4/2020.coling-main.596.pdf
- Data: GLUE, QNLI