DG2: Data Augmentation Through Document Grounded Dialogue Generation

Qingyang Wu, Song Feng, Derek Chen, Sachindra Joshi, Luis Lastras, Zhou Yu


Abstract
Collecting data for training dialog systems can be extremely expensive due to the involvement of human participants and the need for extensive annotation. Especially in document-grounded dialog systems, human experts need to carefully read the unstructured documents to answer the users' questions. As a result, existing document-grounded dialog datasets are relatively small-scale and hinder the effective training of dialogue systems. In this paper, we propose an automatic data augmentation technique grounded on documents through a generative dialogue model. The dialogue model consists of a user bot and an agent bot that can synthesize diverse dialogues given an input document, which are then used to train a downstream model. When supplementing the original dataset, our method achieves significant improvement over traditional data augmentation methods. Our method also performs strongly in the low-resource setting.
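The dual-bot synthesis loop described above can be sketched as follows. This is a hypothetical, minimal illustration: the real DG2 pipeline uses fine-tuned generative models for both bots, whereas here simple template-based stand-ins (`user_bot`, `agent_bot`, `synthesize_dialogue` are all illustrative names) keep the example self-contained.

```python
def user_bot(sentences, turn):
    """Stand-in for a generative user bot: poses a question about the document."""
    return f"Can you tell me about: {sentences[turn]}?"

def agent_bot(sentences, turn):
    """Stand-in for a generative agent bot: grounds its answer in the document."""
    return sentences[turn]

def synthesize_dialogue(document, num_turns=2):
    """Alternate user/agent turns to produce one synthetic grounded dialogue."""
    sentences = [s.strip() for s in document.split(".") if s.strip()]
    num_turns = min(num_turns, len(sentences))
    dialogue = []
    for t in range(num_turns):
        dialogue.append(("user", user_bot(sentences, t)))
        dialogue.append(("agent", agent_bot(sentences, t)))
    return dialogue

doc = "Passports must be renewed every ten years. Renewal forms are online."
for speaker, utterance in synthesize_dialogue(doc):
    print(f"{speaker}: {utterance}")
```

The synthetic dialogues produced this way would then be mixed into the original training set for the downstream document-grounded dialogue model.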
Anthology ID:
2022.sigdial-1.21
Volume:
Proceedings of the 23rd Annual Meeting of the Special Interest Group on Discourse and Dialogue
Month:
September
Year:
2022
Address:
Edinburgh, UK
Editors:
Oliver Lemon, Dilek Hakkani-Tur, Junyi Jessy Li, Arash Ashrafzadeh, Daniel Hernández Garcia, Malihe Alikhani, David Vandyke, Ondřej Dušek
Venue:
SIGDIAL
SIG:
SIGDIAL
Publisher:
Association for Computational Linguistics
Pages:
204–216
URL:
https://aclanthology.org/2022.sigdial-1.21
DOI:
10.18653/v1/2022.sigdial-1.21
Cite (ACL):
Qingyang Wu, Song Feng, Derek Chen, Sachindra Joshi, Luis Lastras, and Zhou Yu. 2022. DG2: Data Augmentation Through Document Grounded Dialogue Generation. In Proceedings of the 23rd Annual Meeting of the Special Interest Group on Discourse and Dialogue, pages 204–216, Edinburgh, UK. Association for Computational Linguistics.
Cite (Informal):
DG2: Data Augmentation Through Document Grounded Dialogue Generation (Wu et al., SIGDIAL 2022)
PDF:
https://aclanthology.org/2022.sigdial-1.21.pdf
Video:
https://youtu.be/mBxsj_qAH80
Data
CoQA, Doc2Dial, QuAC, ShARC, doc2dial