Roux-lette at “Discharge Me!”: Reducing EHR Chart Burden with a Simple, Scalable, Clinician-Driven AI Approach

Suzanne Wendelken, Anson Antony, Rajashekar Korutla, Bhanu Pachipala, James Shanahan, Walid Saba

Abstract
Healthcare providers spend a significant amount of time reading and synthesizing electronic health records (EHRs), negatively impacting patient outcomes and causing provider burnout. Traditional supervised machine learning approaches using large language models (LLMs) to summarize clinical text have struggled due to hallucinations and lack of relevant training data. Here, we present a novel, simplified solution for the “Discharge Me!” shared task. Our approach mimics human clinical workflow, using pre-trained LLMs to answer specific questions and summarize the answers obtained from discharge summaries and other EHR sections. This method (i) avoids hallucinations through hybrid-RAG/zero-shot contextualized prompting; (ii) requires no extensive training or fine-tuning; and (iii) is adaptable to various clinical tasks.
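The abstract describes a pipeline that mirrors clinical workflow: retrieve the relevant section of a discharge note for a specific question, then prompt a pre-trained LLM zero-shot with only that retrieved context. The sketch below is an illustrative mock of that retrieve-then-answer loop, not the authors' implementation: the word-overlap `retrieve` function stands in for real retrieval, and `llm_answer` is a deterministic stub where a real system would call a pre-trained LLM.

```python
import re

def split_sections(ehr_text):
    """Split an EHR note into {header: body} sections on blank lines."""
    sections = {}
    for chunk in ehr_text.strip().split("\n\n"):
        header, _, body = chunk.partition("\n")
        sections[header.strip().rstrip(":").lower()] = body.strip()
    return sections

def retrieve(sections, question):
    """Return the (header, body) pair sharing the most words with the
    question -- a toy stand-in for embedding-based RAG retrieval."""
    q_words = set(re.findall(r"\w+", question.lower()))
    def overlap(item):
        return len(q_words & set(re.findall(r"\w+", item[1].lower())))
    return max(sections.items(), key=overlap)

def llm_answer(question, context):
    """Stub for a zero-shot LLM call: the prompt contains only the
    retrieved context, which is what constrains hallucination."""
    prompt = f"Context:\n{context}\n\nQuestion: {question}\nAnswer:"
    # A real system would send `prompt` to a pre-trained LLM here; the
    # stub just echoes the first sentence of the context as a mock answer.
    return context.split(".")[0] + "."

note = """Hospital Course:
Patient admitted with pneumonia. Treated with IV antibiotics.

Discharge Instructions:
Continue oral antibiotics for 5 days. Follow up with PCP in one week."""

question = "What instructions were given for antibiotics at discharge?"
sections = split_sections(note)
header, context = retrieve(sections, question)
answer = llm_answer(question, context)
```

Because the stub answer is copied verbatim from the retrieved section, the sketch also illustrates why grounding the prompt in retrieved text avoids hallucination: the model can only restate what the chart actually says.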
Anthology ID:
2024.bionlp-1.63
Volume:
Proceedings of the 23rd Workshop on Biomedical Natural Language Processing
Month:
August
Year:
2024
Address:
Bangkok, Thailand
Editors:
Dina Demner-Fushman, Sophia Ananiadou, Makoto Miwa, Kirk Roberts, Junichi Tsujii
Venue:
BioNLP
SIG:
SIGBIOMED
Publisher:
Association for Computational Linguistics
Pages:
719–723
URL:
https://aclanthology.org/2024.bionlp-1.63
Cite (ACL):
Suzanne Wendelken, Anson Antony, Rajashekar Korutla, Bhanu Pachipala, James Shanahan, and Walid Saba. 2024. Roux-lette at “Discharge Me!”: Reducing EHR Chart Burden with a Simple, Scalable, Clinician-Driven AI Approach. In Proceedings of the 23rd Workshop on Biomedical Natural Language Processing, pages 719–723, Bangkok, Thailand. Association for Computational Linguistics.
Cite (Informal):
Roux-lette at “Discharge Me!”: Reducing EHR Chart Burden with a Simple, Scalable, Clinician-Driven AI Approach (Wendelken et al., BioNLP 2024)
PDF:
https://preview.aclanthology.org/bionlp-24-ingestion/2024.bionlp-1.63.pdf