Causal Document-Grounded Dialogue Pre-training

Yingxiu Zhao, Bowen Yu, Bowen Li, Haiyang Yu, Jinyang Li, Chao Wang, Fei Huang, Yongbin Li, Nevin Zhang


Abstract
The goal of document-grounded dialogue (DocGD) is to generate a response by grounding the evidence in a supporting document in accordance with the dialogue context. This process involves four causally interconnected variables. While task-specific pre-training has significantly enhanced performance on numerous downstream tasks, existing DocGD methods still rely on general pre-trained language models without a specifically tailored pre-training approach that explicitly captures the causal relationships. To address this, we present the first causally-complete dataset construction strategy for building million-scale DocGD pre-training corpora. Additionally, we propose a causally-perturbed pre-training strategy to better capture causality by introducing perturbations on the variables and optimizing the overall causal effect. Experiments on three benchmark datasets demonstrate that our causal pre-training yields substantial and consistent improvements in fully-supervised, low-resource, few-shot, and zero-shot settings.
Anthology ID:
2023.emnlp-main.443
Volume:
Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
7160–7174
URL:
https://aclanthology.org/2023.emnlp-main.443
DOI:
10.18653/v1/2023.emnlp-main.443
Cite (ACL):
Yingxiu Zhao, Bowen Yu, Bowen Li, Haiyang Yu, Jinyang Li, Chao Wang, Fei Huang, Yongbin Li, and Nevin Zhang. 2023. Causal Document-Grounded Dialogue Pre-training. In Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing, pages 7160–7174, Singapore. Association for Computational Linguistics.
Cite (Informal):
Causal Document-Grounded Dialogue Pre-training (Zhao et al., EMNLP 2023)
PDF:
https://preview.aclanthology.org/emnlp22-frontmatter/2023.emnlp-main.443.pdf
Video:
https://preview.aclanthology.org/emnlp22-frontmatter/2023.emnlp-main.443.mp4