@inproceedings{chan-fan-2019-recurrent,
    title = "A Recurrent {BERT}-based Model for Question Generation",
    author = "Chan, Ying-Hong  and
      Fan, Yao-Chung",
    editor = "Fisch, Adam  and
      Talmor, Alon  and
      Jia, Robin  and
      Seo, Minjoon  and
      Choi, Eunsol  and
      Chen, Danqi",
    booktitle = "Proceedings of the 2nd Workshop on Machine Reading for Question Answering",
    month = nov,
    year = "2019",
    address = "Hong Kong, China",
    publisher = "Association for Computational Linguistics",
    url = "https://preview.aclanthology.org/iwcs-25-ingestion/D19-5821/",
    doi = "10.18653/v1/D19-5821",
    pages = "154--162",
    abstract = "In this study, we investigate the employment of the pre-trained BERT language model to tackle question generation tasks. We introduce three neural architectures built on top of BERT for question generation tasks. The first one is a straightforward BERT employment, which reveals the defects of directly using BERT for text generation. Accordingly, we propose another two models by restructuring our BERT employment into a sequential manner for taking information from previous decoded results. Our models are trained and evaluated on the recent question-answering dataset SQuAD. Experiment results show that our best model yields state-of-the-art performance which advances the BLEU 4 score of the existing best models from 16.85 to 22.17."
}