@inproceedings{lee-etal-2019-ncuee,
    title = "{NCUEE} at {MEDIQA} 2019: Medical Text Inference Using Ensemble {BERT}-{B}i{LSTM}-Attention Model",
    author = "Lee, Lung-Hao  and
      Lu, Yi  and
      Chen, Po-Han  and
      Lee, Po-Lei  and
      Shyu, Kuo-Kai",
    editor = "Demner-Fushman, Dina  and
      Cohen, Kevin Bretonnel  and
      Ananiadou, Sophia  and
      Tsujii, Junichi",
    booktitle = "Proceedings of the 18th BioNLP Workshop and Shared Task",
    month = aug,
    year = "2019",
    address = "Florence, Italy",
    publisher = "Association for Computational Linguistics",
    url = "https://preview.aclanthology.org/iwcs-25-ingestion/W19-5058/",
    doi = "10.18653/v1/W19-5058",
    pages = "528--532",
    abstract = "This study describes the model design of the NCUEE system for the MEDIQA challenge at the ACL-BioNLP 2019 workshop. We use BERT (Bidirectional Encoder Representations from Transformers) as the word embedding method and integrate a BiLSTM (Bidirectional Long Short-Term Memory) network with an attention mechanism for medical text inference. A total of 42 teams participated in the natural language inference task at MEDIQA 2019. Our best accuracy score of 0.84 ranked in the top third of all submissions on the leaderboard."
}