Improving the Transferability of Clinical Note Section Classification Models with BERT and Large Language Model Ensembles

Weipeng Zhou, Majid Afshar, Dmitriy Dligach, Yanjun Gao, Timothy Miller


Abstract
Text in electronic health records is organized into sections, and classifying those sections into section categories is useful for downstream tasks. In this work, we attempt to improve the transferability of section classification models by combining the dataset-specific knowledge in supervised learning models with the world knowledge inside large language models (LLMs). Surprisingly, we find that zero-shot LLMs outperform supervised BERT-based models applied to out-of-domain data. We also find that their strengths are synergistic, so that a simple ensemble technique leads to additional performance gains.
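The abstract states only that a "simple ensemble technique" combines the supervised BERT classifier with the zero-shot LLM, without specifying the combination rule. The sketch below is one plausible reading, not the paper's published method: a confidence-based fallback in which the supervised model's prediction is kept when its softmax confidence is high and the zero-shot LLM is consulted otherwise. The predictors are stubs, and the label set, threshold, and function names are illustrative assumptions.

```python
"""Minimal sketch of a BERT + zero-shot-LLM ensemble for clinical note
section classification. The fallback rule and the stub predictors are
assumptions for illustration; the abstract only calls the ensemble
"simple" and does not describe its mechanics."""

from dataclasses import dataclass

SECTION_LABELS = ["medications", "assessment", "plan"]  # toy label set


@dataclass
class Prediction:
    label: str
    confidence: float  # e.g., the softmax probability of the argmax label


def bert_predict(text: str) -> Prediction:
    # Stub for a fine-tuned BERT section classifier; a real system would
    # run the model and return the argmax label with its probability.
    if "mg" in text:
        return Prediction("medications", 0.95)
    return Prediction("plan", 0.55)  # low confidence on out-of-domain text


def llm_predict(text: str) -> str:
    # Stub for a zero-shot LLM prompted with the section text and the
    # candidate labels; a real system would parse the model's reply.
    return "assessment"


def ensemble_predict(text: str, threshold: float = 0.9) -> str:
    """Keep the supervised prediction when it is confident; otherwise
    defer to the zero-shot LLM (one plausible 'simple ensemble')."""
    pred = bert_predict(text)
    return pred.label if pred.confidence >= threshold else llm_predict(text)


if __name__ == "__main__":
    print(ensemble_predict("Lisinopril 10 mg daily"))  # -> medications
    print(ensemble_predict("Patient appears stable."))  # -> assessment
```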
Anthology ID: 2023.clinicalnlp-1.16
Volume: Proceedings of the 5th Clinical Natural Language Processing Workshop
Month: July
Year: 2023
Address: Toronto, Canada
Editors: Tristan Naumann, Asma Ben Abacha, Steven Bethard, Kirk Roberts, Anna Rumshisky
Venue: ClinicalNLP
Publisher: Association for Computational Linguistics
Pages: 125–130
URL: https://aclanthology.org/2023.clinicalnlp-1.16
DOI: 10.18653/v1/2023.clinicalnlp-1.16
Cite (ACL):
Weipeng Zhou, Majid Afshar, Dmitriy Dligach, Yanjun Gao, and Timothy Miller. 2023. Improving the Transferability of Clinical Note Section Classification Models with BERT and Large Language Model Ensembles. In Proceedings of the 5th Clinical Natural Language Processing Workshop, pages 125–130, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Improving the Transferability of Clinical Note Section Classification Models with BERT and Large Language Model Ensembles (Zhou et al., ClinicalNLP 2023)
PDF: https://aclanthology.org/2023.clinicalnlp-1.16.pdf