Leveraging Prompt-Learning for Structured Information Extraction from Crohn’s Disease Radiology Reports in a Low-Resource Language
Liam Hazan | Naama Gavrielov | Roi Reichart | Talar Hagopian | Mary-Louise Greer | Ruth Cytter-Kuint | Gili Focht | Dan Turner | Moti Freiman
Proceedings of the 6th Clinical Natural Language Processing Workshop, 2024
Automatic conversion of free-text radiology reports into structured data using Natural Language Processing (NLP) techniques is crucial for analyzing diseases on a large scale. While effective for tasks in widely spoken languages like English, generative large language models (LLMs) typically underperform in less common languages and can pose risks to patient privacy. Fine-tuning local NLP models is hindered by the skewed nature of real-world medical datasets, where rare findings create severe class imbalance. We introduce SMP-BERT, a novel prompt-learning method that leverages the structured nature of reports to overcome these challenges. In our studies involving a substantial collection of Crohn's disease radiology reports in Hebrew (over 8,000 patients and 10,000 reports), SMP-BERT greatly surpassed traditional fine-tuning methods in performance, notably in detecting infrequent conditions (AUC: 0.99 vs 0.94, F1: 0.84 vs 0.34). SMP-BERT makes more accurate AI diagnostics available for low-resource languages.
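The abstract contrasts cloud-hosted generative LLMs with prompt learning over a locally run BERT-style model, which keeps patient data on-premises. As a rough illustration only (the model id, prompt template, and candidate verbalizer tokens below are assumptions for the sketch, not the paper's actual SMP-BERT setup), a masked-language-model prompt can be scored like this:

```python
# Minimal sketch of prompt-style inference with a local masked-language model.
# The Hebrew model id and the prompt/verbalizer choices are illustrative assumptions.
from transformers import pipeline

# Load a locally hosted Hebrew BERT-style model; no report text leaves the machine.
fill_mask = pipeline("fill-mask", model="onlplab/alephbert-base")  # assumed model id

def score_finding(report_section: str, prompt_template: str) -> list[dict]:
    """Append a templated cloze prompt (containing [MASK]) to a report section
    and return the model's ranked predictions for the masked slot."""
    prompt = f"{report_section} {prompt_template}"
    return fill_mask(prompt)

# Hypothetical usage: rank candidate fillers for a finding-specific prompt.
# predictions = score_finding(section_text, "ממצא של היצרות: [MASK].")
# top_token = predictions[0]["token_str"]
```

In a prompt-learning setup like this, the model's existing masked-token head does the classification work, so far fewer labeled examples of rare findings are needed than for training a new classification head from scratch.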