Enhancing Multilingual Document-Grounded Dialogue Using Cascaded Prompt-Based Post-Training Models

Jun Liu, Shuang Cheng, Zineng Zhou, Yang Gu, Jian Ye, Haiyong Luo


Abstract
The Dialdoc23 shared task presents a Multilingual Document-Grounded Dialogue Systems (MDGDS) challenge, where system responses are generated in multiple languages using the user's queries, historical dialogue records, and relevant passages. A major challenge for this task is the limited training data available in low-resource languages such as French and Vietnamese. In this paper, we propose Cascaded Prompt-based Post-training Models, dividing the task into three subtasks: Retrieval, Reranking, and Generation. We conduct post-training on high-resource languages such as English and Chinese to enhance performance on low-resource languages by exploiting similarities between languages. Additionally, we utilize prompting to activate the model's ability across diverse languages within the dialogue domain and explore which prompts work well. Our comprehensive experiments demonstrate the effectiveness of the proposed methods, which achieved first place on the leaderboard with a total score of 215.40 across token-level F1, SacreBleu, and Rouge-L metrics.
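To make the cascaded structure concrete, the sketch below outlines a generic retrieve, rerank, and generate pipeline of the kind the abstract describes. It is an illustration only, not the authors' implementation: the scoring and generation callables are hypothetical placeholders standing in for the paper's post-trained retrieval, reranking, and prompt-based generation models.

```python
# Minimal sketch of a three-stage cascade (Retrieval -> Reranking -> Generation).
# All model-specific components (cheap_score, strong_score, generate) are
# placeholders passed in by the caller; they do not reproduce the paper's models.

from typing import Callable, List


def retrieve(query: str, passages: List[str],
             score: Callable[[str, str], float], top_k: int = 20) -> List[str]:
    """Stage 1: coarse retrieval, keeping the top_k passages by a cheap relevance score."""
    ranked = sorted(passages, key=lambda p: score(query, p), reverse=True)
    return ranked[:top_k]


def rerank(query: str, candidates: List[str],
           score: Callable[[str, str], float], top_k: int = 3) -> List[str]:
    """Stage 2: rerank the retrieved candidates with a stronger (e.g. cross-encoder style) score."""
    ranked = sorted(candidates, key=lambda p: score(query, p), reverse=True)
    return ranked[:top_k]


def build_prompt(query: str, history: List[str], passages: List[str]) -> str:
    """Stage 3 input: a prompt combining dialogue history, grounding passages, and the query."""
    return (
        "Dialogue history:\n" + "\n".join(history) + "\n\n"
        "Grounding passages:\n" + "\n".join(passages) + "\n\n"
        f"User: {query}\nAgent:"
    )


def respond(query: str, history: List[str], corpus: List[str],
            cheap_score: Callable[[str, str], float],
            strong_score: Callable[[str, str], float],
            generate: Callable[[str], str]) -> str:
    """Run the full cascade: retrieve grounding passages, rerank them, then generate a response."""
    candidates = retrieve(query, corpus, cheap_score)
    grounding = rerank(query, candidates, strong_score)
    return generate(build_prompt(query, history, grounding))
```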
Anthology ID:
2023.dialdoc-1.5
Volume:
Proceedings of the Third DialDoc Workshop on Document-grounded Dialogue and Conversational Question Answering
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Smaranda Muresan, Vivian Chen, Casey Kennington, David Vandyke, Nina Dethlefs, Koji Inoue, Erik Ekstedt, Stefan Ultes
Venue:
dialdoc
Publisher:
Association for Computational Linguistics
Pages:
44–51
URL:
https://aclanthology.org/2023.dialdoc-1.5
DOI:
10.18653/v1/2023.dialdoc-1.5
Cite (ACL):
Jun Liu, Shuang Cheng, Zineng Zhou, Yang Gu, Jian Ye, and Haiyong Luo. 2023. Enhancing Multilingual Document-Grounded Dialogue Using Cascaded Prompt-Based Post-Training Models. In Proceedings of the Third DialDoc Workshop on Document-grounded Dialogue and Conversational Question Answering, pages 44–51, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Enhancing Multilingual Document-Grounded Dialogue Using Cascaded Prompt-Based Post-Training Models (Liu et al., dialdoc 2023)
PDF:
https://aclanthology.org/2023.dialdoc-1.5.pdf