@inproceedings{asada-miwa-2023-bionart,
    title = "{B}io{NART}: A Biomedical Non-{A}uto{R}egressive Transformer for Natural Language Generation",
    author = "Asada, Masaki  and
      Miwa, Makoto",
    editor = "Demner-Fushman, Dina  and
      Ananiadou, Sophia  and
      Cohen, Kevin",
    booktitle = "Proceedings of the 22nd Workshop on Biomedical Natural Language Processing and BioNLP Shared Tasks",
    month = jul,
    year = "2023",
    address = "Toronto, Canada",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/2023.bionlp-1.34/",
    doi = "10.18653/v1/2023.bionlp-1.34",
    pages = "369--376",
    abstract = "We propose a novel biomedical domain-specific Non-AutoRegressive Transformer model for natural language generation: BioNART. BioNART is based on an encoder-decoder model, and both the encoder and decoder are compatible with the widely used BERT architecture, which allows the model to benefit from publicly available pre-trained biomedical language model checkpoints. We performed additional pre-training and fine-tuned BioNART on biomedical summarization and doctor-patient dialogue tasks. Experimental results show that BioNART achieves about 94{\%} of the ROUGE score of the pre-trained autoregressive model while realizing an 18 times faster inference speed on the iCliniq dataset."
}