@inproceedings{chan-fan-2019-bert,
    title = "{BERT} for Question Generation",
    author = "Chan, Ying-Hong  and
      Fan, Yao-Chung",
    editor = "van Deemter, Kees  and
      Lin, Chenghua  and
      Takamura, Hiroya",
    booktitle = "Proceedings of the 12th International Conference on Natural Language Generation",
    month = oct # "--" # nov,
    year = "2019",
    address = "Tokyo, Japan",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/W19-8624/",
    doi = "10.18653/v1/W19-8624",
    pages = "173--177",
    abstract = "In this study, we investigate the employment of the pre-trained BERT language model to tackle question generation tasks. We introduce two neural architectures built on top of BERT for question generation tasks. The first one is a straightforward BERT employment, which reveals the defects of directly using BERT for text generation. And, the second one remedies the first one by restructuring the BERT employment into a sequential manner for taking information from previous decoded results. Our models are trained and evaluated on the question-answering dataset SQuAD. Experiment results show that our best model yields state-of-the-art performance which advances the BLEU4 score of existing best models from 16.85 to 18.91."
}
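A minimal sketch of the "sequential" decoding idea described in the abstract: generating a question one token at a time with BERT's masked-LM head, so each step can attend to previously decoded tokens. This is not the authors' released code; the model name, prompt, and greedy decoding loop are illustrative assumptions (the paper fine-tunes BERT on SQuAD and evaluates with BLEU).

```python
# Sketch only: greedy token-by-token generation with a masked-LM BERT.
# Without task-specific fine-tuning the output will not be a useful question.
import torch
from transformers import BertTokenizer, BertForMaskedLM

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForMaskedLM.from_pretrained("bert-base-uncased")
model.eval()

context = "BERT was introduced by Google in 2018."  # hypothetical input passage

# Encode the context, then repeatedly append a [MASK], let BERT fill it,
# and feed the chosen token back in for the next step.
input_ids = tokenizer(context, return_tensors="pt")["input_ids"]
generated = []

for _ in range(20):  # cap on generated question length
    mask = torch.tensor([[tokenizer.mask_token_id]])
    step_ids = torch.cat([input_ids, mask], dim=-1)
    with torch.no_grad():
        logits = model(input_ids=step_ids).logits
    next_id = int(logits[0, -1].argmax())       # greedy pick for the [MASK] slot
    if next_id == tokenizer.sep_token_id:       # treat [SEP] as end of question
        break
    generated.append(next_id)
    input_ids = torch.cat([input_ids, torch.tensor([[next_id]])], dim=-1)

print(tokenizer.decode(generated))
```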