GTM: A Generative Triple-wise Model for Conversational Question Generation

Lei Shen, Fandong Meng, Jinchao Zhang, Yang Feng, Jie Zhou


Abstract
Generating appealing questions in open-domain conversations is an effective way to improve human-machine interaction and to lead the topic in a broader or deeper direction. To avoid dull or off-topic questions, some researchers have tried to utilize the answer, i.e., the “future” information, to guide question generation. However, they separate a post-question-answer (PQA) triple into two parts: the post-question (PQ) pair and the question-answer (QA) pair, which may hurt overall coherence. Moreover, the QA relationship is modeled as a one-to-one mapping, which is not reasonable in open-domain conversations. To tackle these problems, we propose a generative triple-wise model with hierarchical variations for open-domain conversational question generation (CQG). Latent variables at three hierarchies represent the shared background of a triple and the one-to-many semantic mappings in both the PQ and QA pairs. Experimental results on a large-scale CQG dataset show that our method significantly improves question quality in terms of fluency, coherence, and diversity over competitive baselines.
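The hierarchical generative process the abstract describes can be sketched, very loosely, as nested latent-variable sampling: a triple-level variable captures the shared background of the whole PQA triple, and two pair-level variables, conditioned on it, capture the one-to-many PQ and QA mappings. The toy code below is only an illustration of that structure, not the authors' implementation; all dimensions and the simple Gaussian conditioning are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_gaussian(mu, log_var, rng):
    # Reparameterization-style draw: z = mu + sigma * eps
    eps = rng.standard_normal(mu.shape)
    return mu + np.exp(0.5 * log_var) * eps

dim = 4  # hypothetical latent dimension

# 1) Triple-level latent: shared background of the post-question-answer triple.
#    In the paper the prior would be a neural network conditioned on the post.
z_triple = sample_gaussian(np.zeros(dim), np.zeros(dim), rng)

# 2) Pair-level latents, each conditioned on the triple-level variable,
#    modeling one-to-many mappings in the PQ and QA pairs respectively.
z_pq = sample_gaussian(0.5 * z_triple, np.zeros(dim), rng)
z_qa = sample_gaussian(0.5 * z_triple, np.zeros(dim), rng)

# 3) A decoder would then generate the question conditioned on the post
#    encoding together with all three latent variables.
decoder_input = np.concatenate([z_triple, z_pq, z_qa])
print(decoder_input.shape)  # (12,)
```

Sampling different `z_pq` / `z_qa` values for the same post yields different but coherent questions, which is how the one-to-many mappings translate into diverse outputs.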
Anthology ID:
2021.acl-long.271
Volume:
Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers)
Month:
August
Year:
2021
Address:
Online
Venues:
ACL | IJCNLP
Publisher:
Association for Computational Linguistics
Pages:
3495–3506
URL:
https://aclanthology.org/2021.acl-long.271
DOI:
10.18653/v1/2021.acl-long.271
Cite (ACL):
Lei Shen, Fandong Meng, Jinchao Zhang, Yang Feng, and Jie Zhou. 2021. GTM: A Generative Triple-wise Model for Conversational Question Generation. In Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers), pages 3495–3506, Online. Association for Computational Linguistics.
Cite (Informal):
GTM: A Generative Triple-wise Model for Conversational Question Generation (Shen et al., ACL-IJCNLP 2021)
PDF:
https://preview.aclanthology.org/ingestion-script-update/2021.acl-long.271.pdf
Optional supplementary material:
 2021.acl-long.271.OptionalSupplementaryMaterial.zip
Video:
 https://preview.aclanthology.org/ingestion-script-update/2021.acl-long.271.mp4