Synthesize, Prompt and Transfer: Zero-shot Conversational Question Generation with Pre-trained Language Model

Hongwei Zeng, Bifan Wei, Jun Liu, Weiping Fu


Abstract
Conversational question generation aims to generate questions that depend on both the context and the conversation history. Conventional deep learning approaches have shown promising results but rely heavily on the availability of large-scale annotated conversations. In this paper, we introduce a more realistic and less explored setting, Zero-shot Conversational Question Generation (ZeroCQG), which requires no human-labeled conversations for training. To solve ZeroCQG, we propose a multi-stage knowledge transfer framework, Synthesize, Prompt, and trAnsfer with pRe-Trained lAnguage model (SPARTA), to effectively leverage knowledge from single-turn question generation instances. To validate the zero-shot performance of SPARTA, we conduct extensive experiments on three conversational datasets, CoQA, QuAC, and DoQA, by transferring knowledge from three single-turn datasets: MS MARCO, NewsQA, and SQuAD. The experimental results demonstrate the superior performance of our method. Specifically, SPARTA achieves 14.81 BLEU-4 (an 88.2% improvement over T5) on CoQA with knowledge transferred from SQuAD.
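The BLEU-4 score reported above measures 4-gram overlap between generated and reference questions. As a minimal sketch (not the paper's evaluation code, which would use a corpus-level implementation such as sacrebleu), a sentence-level BLEU-4 with add-one smoothing and brevity penalty can be computed as follows; the example sentences are illustrative only:

```python
import math
from collections import Counter

def ngrams(tokens, n):
    """All contiguous n-grams of a token list."""
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

def bleu4(candidate, reference):
    """Simplified sentence-level BLEU-4: geometric mean of smoothed
    1- to 4-gram precisions, times a brevity penalty (single reference)."""
    cand, ref = candidate.split(), reference.split()
    precisions = []
    for n in range(1, 5):
        cand_counts = Counter(ngrams(cand, n))
        ref_counts = Counter(ngrams(ref, n))
        # Clipped overlap: each n-gram counts at most as often as in the reference.
        overlap = sum(min(c, ref_counts[g]) for g, c in cand_counts.items())
        total = max(sum(cand_counts.values()), 1)
        # Add-one smoothing so one empty n-gram order does not zero the score.
        precisions.append((overlap + 1) / (total + 1))
    geo_mean = math.exp(sum(math.log(p) for p in precisions) / 4)
    # Brevity penalty discourages overly short candidates.
    bp = 1.0 if len(cand) >= len(ref) else math.exp(1 - len(ref) / len(cand))
    return bp * geo_mean

# A perfect match scores 1.0; disjoint sentences score much lower.
perfect = bleu4("what did the author propose", "what did the author propose")
poor = bleu4("a b c d e", "v w x y z")
```

Published numbers such as 14.81 BLEU-4 are conventionally reported on a 0–100 scale, i.e. the value above multiplied by 100.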
Anthology ID:
2023.acl-long.500
Volume:
Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
8989–9010
URL:
https://aclanthology.org/2023.acl-long.500
DOI:
10.18653/v1/2023.acl-long.500
Cite (ACL):
Hongwei Zeng, Bifan Wei, Jun Liu, and Weiping Fu. 2023. Synthesize, Prompt and Transfer: Zero-shot Conversational Question Generation with Pre-trained Language Model. In Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 8989–9010, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Synthesize, Prompt and Transfer: Zero-shot Conversational Question Generation with Pre-trained Language Model (Zeng et al., ACL 2023)
PDF:
https://preview.aclanthology.org/ingest-2024-clasp/2023.acl-long.500.pdf
Video:
https://preview.aclanthology.org/ingest-2024-clasp/2023.acl-long.500.mp4