Abstract
Pre-trained language models (PrLMs) have demonstrated superior performance due to their strong ability to learn universal language representations from self-supervised pre-training. However, even with the help of powerful PrLMs, it is still challenging to effectively capture task-related knowledge from dialogue texts, which are rich in correlations among speaker-aware utterances. In this work, we present SPIDER, Structural Pre-traIned DialoguE Reader, to capture dialogue-exclusive features. To simulate these dialogue-like features, we propose two training objectives in addition to the original LM objectives: 1) utterance order restoration, which predicts the order of the permuted utterances in a dialogue context; 2) sentence backbone regularization, which regularizes the model to improve the factual correctness of summarized subject-verb-object triplets. Experimental results on widely used dialogue benchmarks verify the effectiveness of the newly introduced self-supervised tasks.
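Below is a minimal sketch of what the utterance order restoration objective could look like in PyTorch, assuming each utterance has already been encoded into a fixed-size vector by the underlying PrLM. The names (`OrderRestorationHead`, `hidden_size`, `max_utterances`) are illustrative and are not taken from the paper's released code; this is a toy instance of the objective, not the authors' implementation.

```python
# Sketch of utterance order restoration: shuffle per-utterance encoder
# states and train a classifier to predict each utterance's original
# position, supervised with cross-entropy over position labels.
import torch
import torch.nn as nn

class OrderRestorationHead(nn.Module):
    """Predicts each (permuted) utterance's original position."""
    def __init__(self, hidden_size: int, max_utterances: int):
        super().__init__()
        self.classifier = nn.Linear(hidden_size, max_utterances)

    def forward(self, utterance_states: torch.Tensor) -> torch.Tensor:
        # utterance_states: (batch, num_utts, hidden), one vector per
        # utterance (e.g., a pooled state from the PrLM).
        return self.classifier(utterance_states)  # (batch, num_utts, max_utts)

# Toy usage with random states standing in for PrLM outputs.
batch, num_utts, hidden = 2, 5, 16
states = torch.randn(batch, num_utts, hidden)

# Permute utterances; perm[b, i] is the original index of slot i.
perm = torch.stack([torch.randperm(num_utts) for _ in range(batch)])
permuted = torch.gather(states, 1, perm.unsqueeze(-1).expand(-1, -1, hidden))

head = OrderRestorationHead(hidden, max_utterances=num_utts)
logits = head(permuted)  # (batch, num_utts, num_utts)
loss = nn.CrossEntropyLoss()(logits.view(-1, num_utts), perm.view(-1))
loss.backward()
```

In practice this loss would be added to the masked LM loss during pre-training; the sentence backbone regularization objective would contribute a further term scoring the model's consistency with extracted subject-verb-object triplets.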
- Anthology ID: 2021.acl-long.399
- Volume: Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers)
- Month: August
- Year: 2021
- Address: Online
- Venues: ACL | IJCNLP
- Publisher: Association for Computational Linguistics
- Pages: 5134–5145
- URL: https://aclanthology.org/2021.acl-long.399
- DOI: 10.18653/v1/2021.acl-long.399
- Cite (ACL): Zhuosheng Zhang and Hai Zhao. 2021. Structural Pre-training for Dialogue Comprehension. In Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers), pages 5134–5145, Online. Association for Computational Linguistics.
- Cite (Informal): Structural Pre-training for Dialogue Comprehension (Zhang & Zhao, ACL-IJCNLP 2021)
- PDF: https://preview.aclanthology.org/nodalida-main-page/2021.acl-long.399.pdf
- Data: Douban, Douban Conversation Corpus, MuTual