@inproceedings{hao-etal-2020-enhancing,
    title = "Enhancing Clinical {BERT} Embedding using a Biomedical Knowledge Base",
    author = "Hao, Boran  and
      Zhu, Henghui  and
      Paschalidis, Ioannis",
    editor = "Scott, Donia  and
      Bel, Nuria  and
      Zong, Chengqing",
    booktitle = "Proceedings of the 28th International Conference on Computational Linguistics",
    month = dec,
    year = "2020",
    address = "Barcelona, Spain (Online)",
    publisher = "International Committee on Computational Linguistics",
    url = "https://preview.aclanthology.org/ingest-emnlp/2020.coling-main.57/",
    doi = "10.18653/v1/2020.coling-main.57",
    pages = "657--661",
    abstract = "Domain knowledge is important for building Natural Language Processing (NLP) systems for low-resource settings, such as in the clinical domain. In this paper, a novel joint training method is introduced for adding knowledge base information from the Unified Medical Language System (UMLS) into language model pre-training for some clinical domain corpus. We show that in three different downstream clinical NLP tasks, our pre-trained language model outperforms the corresponding model with no knowledge base information and other state-of-the-art models. Specifically, in a natural language inference task applied to clinical texts, our knowledge base pre-training approach improves accuracy by up to 1.7{\%}, whereas in clinical name entity recognition tasks, the F1-score improves by up to 1.0{\%}. The pre-trained models are available at \url{https://github.com/noc-lab/clinical-kb-bert}."
}
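
The abstract notes that the pre-trained models are released at https://github.com/noc-lab/clinical-kb-bert. As a minimal sketch (not from the paper), the snippet below shows how such a BERT checkpoint could be loaded with Hugging Face Transformers, assuming the released weights have been downloaded into a local directory; the `./clinical-kb-bert` path is hypothetical.

```python
# Minimal sketch: load a locally downloaded Clinical KB-BERT checkpoint
# (hypothetical path) and extract a [CLS] sentence embedding.
from transformers import AutoModel, AutoTokenizer

model_dir = "./clinical-kb-bert"  # hypothetical local checkpoint directory

tokenizer = AutoTokenizer.from_pretrained(model_dir)
model = AutoModel.from_pretrained(model_dir)

# Encode an example clinical sentence.
inputs = tokenizer("Patient denies chest pain or dyspnea.", return_tensors="pt")
outputs = model(**inputs)

# Take the hidden state of the [CLS] token as the sentence embedding.
cls_embedding = outputs.last_hidden_state[:, 0, :]  # shape: (1, hidden_size)
print(cls_embedding.shape)
```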