BERT for Question Generation

Ying-Hong Chan, Yao-Chung Fan


Abstract
In this study, we investigate employing the pre-trained BERT language model for question generation. We introduce two neural architectures built on top of BERT. The first is a straightforward application of BERT, which exposes the shortcomings of using BERT directly for text generation. The second remedies this by restructuring the BERT employment into a sequential decoding scheme that conditions each step on previously decoded results. Our models are trained and evaluated on the question-answering dataset SQuAD. Experimental results show that our best model achieves state-of-the-art performance, improving the BLEU-4 score of the best existing models from 16.85 to 18.91.
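To make the abstract's second architecture concrete, the sketch below illustrates the sequential idea: at each decoding step the passage, the answer, and the question tokens decoded so far are fed to BERT, which predicts the word at a trailing [MASK] position. This is not the authors' released code; the HuggingFace transformers interface, the off-the-shelf bert-base-uncased weights (with no question-generation fine-tuning), the [SEP]-separated input layout, and the greedy decoding loop are all illustrative assumptions.

# Minimal sketch of sequential, BERT-based question generation.
# Assumptions: HuggingFace transformers, bert-base-uncased, greedy decoding,
# passage [SEP] answer [SEP] question-so-far [MASK] as the input layout.
import torch
from transformers import BertTokenizer, BertForMaskedLM

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForMaskedLM.from_pretrained("bert-base-uncased")
model.eval()

def generate_question(passage, answer, max_len=20):
    decoded = []  # question tokens produced so far
    for _ in range(max_len):
        # Append a [MASK] slot after the partially decoded question.
        question_so_far = tokenizer.convert_tokens_to_string(decoded) + " " + tokenizer.mask_token
        text = f"{passage} {tokenizer.sep_token} {answer} {tokenizer.sep_token} {question_so_far}"
        inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=512)
        with torch.no_grad():
            logits = model(**inputs).logits
        # Greedily take the token predicted at the (last) [MASK] position.
        mask_pos = (inputs["input_ids"][0] == tokenizer.mask_token_id).nonzero()[-1].item()
        next_token = tokenizer.convert_ids_to_tokens(logits[0, mask_pos].argmax().item())
        if next_token in (tokenizer.sep_token, "?"):  # crude end-of-question signal
            decoded.append("?")
            break
        decoded.append(next_token)
    return tokenizer.convert_tokens_to_string(decoded)

print(generate_question("BERT was introduced by researchers at Google in 2018.", "Google"))

In the paper the model is trained on SQuAD (passage, answer, question) triples so that the masked position learns to emit the next question word; without such fine-tuning, the off-the-shelf weights above will not produce fluent questions.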
Anthology ID:
W19-8624
Volume:
Proceedings of the 12th International Conference on Natural Language Generation
Month:
October–November
Year:
2019
Address:
Tokyo, Japan
Editors:
Kees van Deemter, Chenghua Lin, Hiroya Takamura
Venue:
INLG
SIG:
SIGGEN
Publisher:
Association for Computational Linguistics
Pages:
173–177
URL:
https://aclanthology.org/W19-8624
DOI:
10.18653/v1/W19-8624
Cite (ACL):
Ying-Hong Chan and Yao-Chung Fan. 2019. BERT for Question Generation. In Proceedings of the 12th International Conference on Natural Language Generation, pages 173–177, Tokyo, Japan. Association for Computational Linguistics.
Cite (Informal):
BERT for Question Generation (Chan & Fan, INLG 2019)
PDF:
https://aclanthology.org/W19-8624.pdf