Domain-Adaptive Pretraining Methods for Dialogue Understanding
Han Wu, Kun Xu, Linfeng Song, Lifeng Jin, Haisong Zhang, Linqi Song
Abstract
Language models like BERT and SpanBERT pretrained on open-domain data have obtained impressive gains on various NLP tasks. In this paper, we probe the effectiveness of domain-adaptive pretraining objectives on downstream tasks. In particular, three objectives, including a novel objective focusing on modeling predicate-argument relations, are evaluated on two challenging dialogue understanding tasks. Experimental results demonstrate that domain-adaptive pretraining with proper objectives can significantly improve the performance of a strong baseline on these tasks, achieving new state-of-the-art performance.
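The abstract does not spell out the paper's three objectives, so the sketch below is not the authors' exact recipe. It illustrates the general domain-adaptive pretraining setup the paper builds on: continuing masked-language-model (MLM) training of an open-domain checkpoint on in-domain dialogue text before fine-tuning. The model name, data file, and hyperparameters are illustrative assumptions, using the Hugging Face Transformers API.

```python
# Minimal sketch of domain-adaptive pretraining: continue MLM training of an
# open-domain checkpoint on in-domain dialogue text. NOT the paper's exact
# objectives; names and hyperparameters below are illustrative assumptions.
from datasets import load_dataset
from transformers import (
    AutoModelForMaskedLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForMaskedLM.from_pretrained("bert-base-uncased")

# Hypothetical raw corpus of dialogue turns, one utterance per line.
dataset = load_dataset("text", data_files={"train": "dialogues.txt"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=128)

tokenized = dataset["train"].map(tokenize, batched=True, remove_columns=["text"])

# Randomly mask 15% of tokens; the model learns to reconstruct them from
# dialogue context, adapting its representations to the target domain.
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm_probability=0.15)

args = TrainingArguments(
    output_dir="dapt-dialogue",        # checkpoint directory (illustrative)
    num_train_epochs=1,
    per_device_train_batch_size=16,
    learning_rate=5e-5,
)

Trainer(
    model=model,
    args=args,
    train_dataset=tokenized,
    data_collator=collator,
).train()
```

After this continued-pretraining step, the adapted checkpoint would be fine-tuned on the downstream dialogue understanding task in the usual way.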
- Anthology ID: 2021.acl-short.84
- Volume: Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 2: Short Papers)
- Month: August
- Year: 2021
- Address: Online
- Venues: ACL | IJCNLP
- Publisher: Association for Computational Linguistics
- Pages: 665–669
- URL: https://aclanthology.org/2021.acl-short.84
- DOI: 10.18653/v1/2021.acl-short.84
- Cite (ACL): Han Wu, Kun Xu, Linfeng Song, Lifeng Jin, Haisong Zhang, and Linqi Song. 2021. Domain-Adaptive Pretraining Methods for Dialogue Understanding. In Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 2: Short Papers), pages 665–669, Online. Association for Computational Linguistics.
- Cite (Informal): Domain-Adaptive Pretraining Methods for Dialogue Understanding (Wu et al., ACL-IJCNLP 2021)
- PDF: https://aclanthology.org/2021.acl-short.84.pdf
- Data: CrossWOZ