PRAL: A Tailored Pre-Training Model for Task-Oriented Dialog Generation

Jing Gu, Qingyang Wu, Chongruo Wu, Weiyan Shi, Zhou Yu


Abstract
Large pre-trained language generation models such as GPT-2 have demonstrated their effectiveness as language priors by reaching state-of-the-art results in various language generation tasks. However, the performance of pre-trained models on task-oriented dialog tasks is still under-explored. We propose a Pre-trained Role Alternating Language model (PRAL), explicitly designed for task-oriented conversational systems. We design several techniques: start position randomization, knowledge distillation, and history discount to improve pre-training performance. In addition, we introduce a high-quality large-scale task-oriented dialog pre-training dataset by post-processing 13 dialog datasets. We effectively adapt PRAL on three downstream tasks. The results show that PRAL outperforms or is on par with state-of-the-art models.
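The paper's exact formulations are not reproduced on this page. As a rough illustration only, the PyTorch sketch below (all names hypothetical) shows one plausible reading of two of the listed techniques: a history discount that down-weights the loss on tokens from earlier dialog turns, and start position randomization that offsets absolute position ids so the model is not tied to starting at position 0.

import torch
import torch.nn.functional as F

def history_discounted_loss(logits, targets, turn_index, gamma=0.95):
    # Per-token cross-entropy, down-weighting tokens from earlier dialog
    # turns by gamma ** (turns before the latest turn). Hypothetical
    # sketch; the paper may weight history differently.
    per_token = F.cross_entropy(
        logits.view(-1, logits.size(-1)),
        targets.view(-1),
        reduction="none",
    ).view(targets.shape)
    num_turns = int(turn_index.max()) + 1
    weights = gamma ** (num_turns - 1 - turn_index).float()
    return (weights * per_token).sum() / weights.sum()

def randomized_position_ids(seq_len, max_positions=1024):
    # Start position randomization: begin position ids at a random
    # offset instead of 0, so all absolute positions are seen in training.
    start = int(torch.randint(0, max_positions - seq_len + 1, (1,)))
    return torch.arange(start, start + seq_len)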
Anthology ID:
2021.acl-short.40
Volume:
Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 2: Short Papers)
Month:
August
Year:
2021
Address:
Online
Venues:
ACL | IJCNLP
Publisher:
Association for Computational Linguistics
Pages:
305–313
URL:
https://aclanthology.org/2021.acl-short.40
DOI:
10.18653/v1/2021.acl-short.40
Cite (ACL):
Jing Gu, Qingyang Wu, Chongruo Wu, Weiyan Shi, and Zhou Yu. 2021. PRAL: A Tailored Pre-Training Model for Task-Oriented Dialog Generation. In Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 2: Short Papers), pages 305–313, Online. Association for Computational Linguistics.
Cite (Informal):
PRAL: A Tailored Pre-Training Model for Task-Oriented Dialog Generation (Gu et al., ACL-IJCNLP 2021)
PDF:
https://preview.aclanthology.org/ingestion-script-update/2021.acl-short.40.pdf
Video:
https://preview.aclanthology.org/ingestion-script-update/2021.acl-short.40.mp4